In development scenarios where the inference endpoint is reached over a shared network rather than on localhost, it seems reasonable to secure the connection with self-signed SSL certificates.
It would be nice to have such an option here, if catering to this use case is deemed appropriate.
Speaking personally, I raise this with the risks acknowledged. If a "Skip Certificate Verification" toggle were added to the UI, the customary reminder not to use insecure connections across untrusted networks would be appropriate.
You can use the `--ssl-key-file` and `--ssl-cert-file` CLI args to provide the server with a certificate.
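For example, something along these lines (assuming the server build has SSL support enabled; paths are placeholders):

```sh
# Example invocation; key/cert/model paths are placeholders.
llama-server \
  --host 0.0.0.0 --port 8443 \
  --ssl-key-file /path/to/server.key \
  --ssl-cert-file /path/to/server.crt \
  -m /path/to/model.gguf
```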
Thanks, that is good to know. However, I already have a reverse proxy handling SSL; the issue is that the llama.vscode extension has no option to allow self-signed certificates.
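For illustration only, here is a minimal sketch of how such a toggle could be wired up on the extension side, assuming a Node `https`-based client. The `llama-vscode.allowSelfSignedCerts` setting name and the request helper are hypothetical, not the extension's actual code:

```ts
// Sketch only: honour a hypothetical "allow self-signed certificates"
// setting when calling the inference endpoint over HTTPS.
import * as https from "https";
import * as vscode from "vscode";

function postToEndpoint(endpoint: string, body: string): Promise<string> {
  // Hypothetical setting name; off by default so verification stays strict.
  const allowSelfSigned = vscode.workspace
    .getConfiguration("llama-vscode")
    .get<boolean>("allowSelfSignedCerts", false);

  const url = new URL(endpoint);
  const options: https.RequestOptions = {
    method: "POST",
    hostname: url.hostname,
    port: url.port || 443,
    path: url.pathname,
    headers: { "Content-Type": "application/json" },
    // Disabling verification is what lets a self-signed certificate through;
    // only reasonable on trusted development networks.
    rejectUnauthorized: !allowSelfSigned,
  };

  return new Promise((resolve, reject) => {
    const req = https.request(options, (res) => {
      let data = "";
      res.on("data", (chunk) => (data += chunk));
      res.on("end", () => resolve(data));
    });
    req.on("error", reject);
    req.write(body);
    req.end();
  });
}
```

An alternative that avoids disabling verification entirely would be letting users point the extension at a custom CA bundle via the `ca` option of Node's TLS settings, so the self-signed certificate can be trusted explicitly instead of skipping the check.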