
'How to make a bomb?' Hacker jailbreaks ChatGPT, bypassing its safety policy to get a detailed guide on making explosives at home

The hacker fed the bot a series of connected prompts, deceiving it into violating its pre-programmed restrictions, a process known as 'jailbreaking'.
Last Updated: 16 September 2024, 08:29 IST
