What is the bug?
Sending a chat completion request that includes tools to LM Studio for `firefunction-v2` results in:
```
Error: received prediction-error
    t.fromSerializedError (/Applications/LM Studio.app/Contents/Resources/app/.webpack/main/index.js:382:62222)
    at m.o (/Applications/LM Studio.app/Contents/Resources/app/.webpack/main/index.js:25:155460)
    at m.emit (node:events:519:28)
    at m.onChildMessage (/Applications/LM Studio.app/Contents/Resources/app/.webpack/main/index.js:25:140859)
    at m.onChildMessage (/Applications/LM Studio.app/Contents/Resources/app/.webpack/main/index.js:25:164118)
    at ForkUtilityProcess.<anonymous> (/Applications/LM Studio.app/Contents/Resources/app/.webpack/main/index.js:25:140194)
    at ForkUtilityProcess.emit (node:events:519:28)
    at ForkUtilityProcess.a.emit (node:electron/js2c/browser_init:2:71823)
  Caused By: SyntaxError: Unknown statement type: Identifier
    at /Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:33:115985
    at i (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:33:116050)
    at R (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:33:120440)
    at new se (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:33:137373)
    at /Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:10:36458
    at async t.formatPrompt (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:10:34152)
    at async t.getContextOverflowPolicy (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:10:49555)
    at async t.temp_createLlamaPredictionArgs (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:8:366412)
    at async t.LLMEngineWrapper.predictTokens (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:10:26226)
    at async Object.predictTokens (/Applications/LM Studio.app/Contents/Resources/app/.webpack/lib/llmworker.js:15:7181)
```
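For reproduction, a request along these lines triggers the error. This is a sketch, assuming LM Studio's OpenAI-compatible server on its default `http://localhost:1234` endpoint; the `get_weather` tool definition is illustrative, not taken from the original report:

```python
import json

# Minimal chat completion payload that includes a tool definition.
# POSTing this to LM Studio's /v1/chat/completions endpoint with
# firefunction-v2 loaded produces the prediction-error above.
payload = {
    "model": "firefunction-v2",
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool name
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# To send it, e.g.:
#   requests.post("http://localhost:1234/v1/chat/completions", json=payload)
print(json.dumps(payload, indent=2))
```

Omitting the `tools` field from the same payload avoids the error, which points at the tool-call prompt templating rather than the model itself.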
Screenshots
Additional Info
For reference: tool calling works correctly with the same model under Ollama.
firefunction-v2 is a fine-tune of Llama 3 that supports multiple parallel tool calls by emitting a single JSON payload containing multiple function calls.
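To illustrate that output shape only (the function names are made up and the exact wire format firefunction-v2 emits may differ):

```python
import json

# Illustrative sketch: one JSON array carrying two parallel function calls,
# the general "one JSON, multiple calls" shape described above.
parallel_calls = [
    {"name": "get_weather", "arguments": {"city": "Paris"}},
    {"name": "get_weather", "arguments": {"city": "Berlin"}},
]

# A single JSON document encodes both calls.
print(json.dumps(parallel_calls))
```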
Which version of LM Studio?
LM Studio 0.3.6
Which operating system?
macOS