
Overall, 54% of Americans say artificial intelligence programs that generate text and images, like ChatGPT and DALL-E, need to credit the sources they rely on to produce their responses. A much smaller share (14%) says the programs don’t need to credit sources, according to a new Pew Research Center survey. About a third say they’re not sure.

Pew Research Center published this analysis as part of its ongoing work to understand attitudes about artificial intelligence. This analysis draws on a survey of 10,133 U.S. adults conducted from Feb. 7 to 11, 2024.

Everyone who took part in the survey is a member of the Pew Research Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way, nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the ATP’s methodology.

Here are the questions used for the analysis, along with responses, and its methodology.

A separate Pew Research Center analysis finds growing public engagement with ChatGPT – one of the most well-known examples of generative AI – especially among young people.

Generative AI programs work by reviewing large amounts of existing content, such as the works of an artist or a news organization. That allows them to generate responses when users ask questions.

This process has spurred lawsuits from authors, artists and news organizations, who argue that training AI programs on their work is an unauthorized use of copyrighted material. But some technology companies argue that this is fair use under copyright law and that the programs provide a clear public benefit.

Our survey finds that the public consistently says AI programs should credit sources across seven examples of content they could generate.

For instance, 75% say AI programs should have to credit the sources they rely on if they provide information that matches what a journalist wrote nearly word-for-word. Just 6% say they shouldn’t have to credit their sources in this scenario, while 19% say they’re not sure.

Majorities of U.S. adults (67% each) also see a need for crediting sources if AI programs generate images that imitate the style of a current artist or text that imitates the style of a current author.

Whether an author is living or dead has little impact on public attitudes: 65% say credit is needed if AI programs imitate the writing style of a famous author who died many years ago.

Similarly, about six-in-ten say generative AI programs should have to credit the sources they rely on if they draft a movie script in the style of a popular movie. Hollywood screenwriters recently secured limits on using AI in script writing, as part of a larger labor agreement.

The view that credit is needed also extends to more general types of information. For instance, 60% of Americans say AI programs should have to credit the sources they use if they summarize information about the U.S. population. And 61% say credit is needed if these programs provide information that was reported by many different news organizations.

How often do Americans think they interact with AI?

Over the years, Center surveys have explored public views on multiple aspects of artificial intelligence, including overall awareness of and engagement with these technologies.

Our new survey finds that 22% of Americans say they interact with artificial intelligence almost constantly or several times a day. Another 27% say they interact with AI about once a day or several times a week. Half of Americans think they interact with AI less often.

Adults with higher levels of education are more likely than those with less education to say they interact with AI frequently. For instance, 63% of postgraduates and 57% of college graduates say they interact with AI at least several times a week. That compares with 50% of those with some college education and 36% of those with a high school diploma or less education.

Younger Americans also are more likely than their older peers to say they interact with AI often. Majorities of those ages 18 to 29 (56%) and 30 to 49 (54%) say they interact with AI at least several times a week. Smaller shares of those ages 50 to 64 (46%) and 65 and older (37%) say the same.

While AI now powers many widely used functions – like personalized online shopping recommendations – its presence may not always be visible to all Americans. For instance, only 30% of U.S. adults correctly identified the presence of AI across all six examples in a recent survey about AI awareness.


Alec Tyson is an associate director of research at Pew Research Center.
Brian Kennedy is a senior researcher focusing on science and society research at Pew Research Center.
