From 418d50d9f84c2be647317938219f5f63b5a678f5 Mon Sep 17 00:00:00 2001
From: Georgi Gerganov
Date: Wed, 3 Apr 2024 20:45:59 +0300
Subject: [PATCH] fix

---
 SECURITY.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/SECURITY.md b/SECURITY.md
index b94becbac..b1e5576bb 100644
--- a/SECURITY.md
+++ b/SECURITY.md
@@ -1,6 +1,6 @@
 # Security Policy
 
-  - [**Using llama.cpp securely**](#using-llama-cpp-securely)
+  - [**Using llama.cpp securely**](#using-llamacpp-securely)
    - [Untrusted models](#untrusted-models)
    - [Untrusted inputs](#untrusted-inputs)
    - [Data privacy](#data-privacy)
@@ -57,7 +57,7 @@ If you intend to run multiple models in parallel with shared memory, it is your
 
 ## Reporting a vulnerability
 
-Beware that none of the topics under [Using llama.cpp securely](#using-llama-cpp-securely) are considered vulnerabilities of LLaMA C++.
+Beware that none of the topics under [Using llama.cpp securely](#using-llamacpp-securely) are considered vulnerabilities of LLaMA C++.
 
 However, If you have discovered a security vulnerability in this project, please report it privately. **Do not disclose it as a public issue.** This gives us time to work with you to fix the issue before public exposure, reducing the chance that the exploit will be used before a patch is released.