top of page

GPT-5 Brings New Power — and New Risks

  • Team UPG IT
  • Aug 31
  • 1 min read

The latest version of ChatGPT, GPT-5, can now connect directly to your apps to send emails and manage your calendar. It’s powerful and convenient, but also comes with serious risks.


One major concern is prompt injection. This happens when GPT-5 unknowingly follows hidden instructions from a website, document, or chat message. It could send data to the wrong place, click through to fake websites, or even forward sensitive information — often without you realising.


Another risk is data storage. Information you type into GPT-5, such as personal details or company files, may be stored and reused to train future models. There’s also the possibility that data could be shared unintentionally with third parties.


For advanced features like API integrations, you may be asked to upload an ID and a selfie to verify your identity. While these services promise strong security, storing sensitive documents always carries a risk of breaches or misuse.


What you should do.

  • Limit access: only allow GPT-5 to use the files or mailboxes you really need, and disconnect when done.

  • Approve actions: ask GPT-5 to show you a list of planned steps and confirm them one by one.

  • Avoid sharing sensitive data: never input passwords or confidential company information.

Comments


bottom of page