Published in System Weakness
3 Steps to protect yourself from Prompt Injection
Head over to https://defender.safetorun.com to quickstart!
May 1

How to test the safety and security of LLM apps
I've written a guide on how to test LLM apps for security and safety. Looking forward to hearing what you think!
Dec 19

GitHub Copilot Edits to update multiple files
I was really excited to see the latest feature added to GitHub Copilot for VS Code, albeit in preview.
Nov 8

Genkit (VertexAI) — Crashing with permission error and how to fix it
I kept getting an error that looks something like this…
Oct 5

A Quieter revolution — subtler ways to use generative AI to change how we do product engineering
A failure of imagination
Sep 20

The best attacks and defences against prompt injection
A framework for evaluating attacks and defences
May 10

Risks and Riddles
The new security battlegrounds of applications using ChatGPT
Dec 13, 2023