Ensure that RAG state is disabled at change of LLM inference-model
Mimic the "switchTab" logic of ensuring the correct RAG state.

Signed-off-by: marijnvg-tng <[email protected]>
marijnvg-tng committed Dec 19, 2024
1 parent e074eda commit 78a12d0
Showing 1 changed file with 10 additions and 0 deletions.
WebUI/src/App.vue: 10 additions & 0 deletions
@@ -142,10 +142,12 @@ import {useTheme} from "./assets/js/store/theme.ts";
 import AddLLMDialog from "@/components/AddLLMDialog.vue";
 import WarningDialog from "@/components/WarningDialog.vue";
 import {useBackendServices} from "./assets/js/store/backendServices.ts";
+import {useTextInference} from "@/assets/js/store/textInference.ts";
 
 const backendServices = useBackendServices();
 const theme = useTheme();
 const globalSetup = useGlobalSetup();
+const textInference = useTextInference()
 
 const enhanceCompt = ref<InstanceType<typeof Enhance>>();
 const answer = ref<InstanceType<typeof Answer>>();
@@ -254,6 +256,14 @@ function switchTab(index: number) {
   }
 }
 
+watch(textInference, (newSetting, oldSetting) => {
+  if (newSetting.backend === 'LLAMA.CPP') {
+    answer.value!.disableRag();
+  } else {
+    answer.value!.restoreRagState();
+  }
+})
+
 function miniWindow() {
   window.electronAPI.miniWindow();
 }
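Note on the methods used above: this commit only adds the watcher; the disableRag() and restoreRagState() methods it calls belong to the Answer component, whose source is not part of this diff. Below is a minimal sketch of how Answer.vue might expose them, assuming it keeps a simple RAG toggle plus the value remembered before it was force-disabled. Only the two method names come from the diff; the state names and internals are assumptions for illustration.

<script setup lang="ts">
// Hypothetical excerpt of Answer.vue — illustration only, not part of this commit.
import { ref } from "vue";

// Current RAG toggle and the value the user had before RAG was force-disabled.
const ragEnabled = ref(false);
const previousRagState = ref(false);

// Force RAG off and remember the previous choice.
function disableRag() {
  previousRagState.value = ragEnabled.value;
  ragEnabled.value = false;
}

// Put RAG back to whatever the user had selected before it was force-disabled.
function restoreRagState() {
  ragEnabled.value = previousRagState.value;
}

// Make both methods callable from App.vue through the `answer` template ref.
defineExpose({ disableRag, restoreRagState });
</script>

With something along these lines in place, the new watcher mirrors the existing switchTab behaviour: when the selected text-inference backend changes to LLAMA.CPP, RAG is switched off through the answer ref, and changing to any other backend restores the previously selected RAG state.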
