
OpenAI responds to former researcher Suchir Balaji’s death


OpenAI released a statement on Thursday regarding the death of its former researcher Suchir Balaji, confirming that it has been in contact with Balaji’s family to offer its full support.

“We were devastated to learn of this tragic news and have been in touch with Suchir’s family to offer our full support during this difficult time,” the company stated. “Our priority is to continue to do everything we can to assist them.”

Balaji, who had raised concerns that OpenAI was violating copyright law, was found dead in his San Francisco apartment earlier this month. He had been vocal about the AI industry’s practices, particularly the use of copyrighted content to train generative AI (GenAI) models.

His whistleblowing gained traction amid a wave of lawsuits by writers, programmers and journalists alleging that OpenAI illegally used their content to develop ChatGPT, a popular chatbot now used by millions globally. OpenAI said that it first became aware of Balaji’s concerns when his comments were published in The New York Times (NYT).

According to the company, it had no further interaction with Balaji after that.


In an interview with the NYT, Balaji shared that he had initially been unaware of the complexities of copyright law but developed a strong interest in the subject as he observed a growing number of lawsuits being filed against AI companies.

In a widely shared post on X, he said, “I recently participated in a NYT story about fair use and generative AI, and why I’m sceptical ‘fair use’ would be a plausible defence for a lot of generative AI products. I initially didn’t know much about copyright, fair use, etc., but became curious after seeing all the lawsuits filed against GenAI companies.”

Tesla and SpaceX CEO Elon Musk, who is in a legal battle with OpenAI CEO Sam Altman, responded to the news of Balaji’s death with a cryptic ‘hmm’ post on X.




