The `await tick()` wait before scrolling was already in place for Ollama but not for OpenAI, causing the chat to fail to scroll down in certain cases. This fixes the issue by adding the same wait to `sendPromptOpenAI`.
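For context, Svelte's `tick()` returns a promise that resolves once pending state changes have been flushed to the DOM; without it, `scrollToBottom()` can run before the new response message is rendered, so the measured `scrollHeight` is still the old value. A rough sketch of the pattern (the element id, state variable, and helper below are illustrative assumptions, not the actual ones in this file):

```js
// Sketch only: names here are assumed for illustration.
import { tick } from 'svelte';

let messages = [];

const scrollToBottom = () => {
	const container = document.getElementById('messages-container');
	if (container) {
		container.scrollTop = container.scrollHeight;
	}
};

const addMessageAndScroll = async (message) => {
	// Updating reactive state only schedules a DOM update.
	messages = [...messages, message];

	// tick() resolves after Svelte has flushed pending DOM changes,
	// so scrollHeight now includes the newly rendered message.
	await tick();
	scrollToBottom();
};
```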
@@ -517,6 +517,10 @@
const sendPromptOpenAI = async (model, userPrompt, responseMessageId, _chatId) => {
const responseMessage = history.messages[responseMessageId];
+
+ // Wait until history/message have been updated
+ await tick();
scrollToBottom();
const docs = messages
@@ -527,6 +527,10 @@