Parekh's Law of Chatbots ... 25 Feb 2023

Clause (A):

Answers delivered by an AI chatbot must not be Mis-informative / Malicious / Slanderous / Fictitious / Dangerous / Provocative / Abusive / Arrogant / Instigating / Insulting / Denigrating of humans, etc.
Context:

AP, other news organisations develop standards for use of AI in newsrooms .... ET / 18 Aug 2023
Extract:

The Associated Press has issued guidelines on artificial intelligence, saying the tool cannot be used to create publishable content and images for the news service, while encouraging staff members to become familiar with the technology.

AP is one of a handful of news organisations that have begun to set rules on how to integrate fast-developing tech tools like ChatGPT into their work. The service will couple this on Thursday with a chapter in its influential Stylebook that advises journalists how to cover the story, complete with a glossary of terminology.
"Our goal is to give people a good way to understand how we can do a little experimentation but also be safe," said Amanda Barrett, vice president of news standards and inclusion at AP.

The journalism think tank Poynter Institute, saying it was a "transformational moment," urged news organisations this spring to create standards for AI's use and share the policies with readers and viewers.
Generative AI has the ability to create text, images, audio and video on command, but isn't yet fully capable of distinguishing between fact and fiction. As a result, AP said material produced by artificial intelligence should be vetted carefully, just like material from any other news source.

Similarly, AP said a photo, video or audio segment generated by AI should not be used unless the altered material is itself the subject of a story.
That's in line with the tech magazine Wired, which said it does not publish stories generated by AI, "except when the fact that it's AI-generated is the point of the whole story."
"Your stories must be completely written by you," Nicholas Carlson, Insider editor-in-chief, wrote in a note to employees that was shared with readers. "You are responsible for the accuracy, fairness, originality and quality of every word in your stories."
Highly publicised cases of AI-generated "hallucinations," or made-up facts, make it important that consumers know that standards are in place to "make sure the content they're reading, watching and listening to is verified, credible and as fair as possible," Poynter said in an editorial.
With regards,
Hemen Parekh
www.hemenparekh.ai / 18 Aug 2023