Dependency version conflict makes HuggingFace models unusable
🐛 Describe the bug
Description
The @huggingface/inference dependency is outdated, which causes this error:
project/node_modules/@llm-tools/embedjs-huggingface/node_modules/@langchain/community/dist/embeddings/hf.js:1
import { InferenceClient, } from "@huggingface/inference";
^^^^^^^^^^^^^^^
SyntaxError: The requested module '@huggingface/inference' does not provide an export named 'InferenceClient'
at #_instantiate (node:internal/modules/esm/module_job:254:21)
at async ModuleJob.run (node:internal/modules/esm/module_job:363:5)
at async onImport.tracePromise.__proto__ (node:internal/modules/esm/loader:683:26)
at async asyncRunEntryPointWithESMLoader (node:internal/modules/run_main:101:5)
Node.js v24.7.0
I tried manually setting it to the latest version inside the embedjs-huggingface module, and that error disappears, but then another problem occurs:
project/node_modules/@llm-tools/embedjs-huggingface/node_modules/@llm-tools/embedjs-interfaces/src/interfaces/base-model.js:57
if (!(await BaseModel.store.hasConversation(conversationId))) {
^
TypeError: Cannot read properties of undefined (reading 'hasConversation')
at HuggingFace.query (project/node_modules/@llm-tools/embedjs-huggingface/core/embedjs-interfaces/src/interfaces/base-model.ts:86:41)
at RAGApplication.query (project/node_modules/@llm-tools/embedjs/src/core/rag-application.js:361:27)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async <anonymous> (project/src/index.ts:33:13)
Node.js v24.7.0
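A possible interim workaround (a sketch, not verified) is npm's overrides field (npm ≥ 8.3). The second stack trace shows a nested copy of @llm-tools/embedjs-interfaces under embedjs-huggingface's own node_modules, so setStore() likely sets BaseModel.store on one copy while the model reads it from the other. Forcing single resolved versions may avoid both the outdated import and the duplicate; the version numbers below are assumptions taken from the dependencies list:

```json
{
  "overrides": {
    "@huggingface/inference": "^4.11.3",
    "@llm-tools/embedjs-interfaces": "^0.1.29"
  }
}
```

After adding this to package.json, delete node_modules and the lockfile and run npm install again so the tree is rebuilt.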
Information to reproduce
package.json:
{
"name": "rag",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"dev": "tsx watch --include src/* src/index.ts"
},
"keywords": [],
"author": "",
"license": "ISC",
"type": "module",
"devDependencies": {
"@types/node": "^24.8.1",
"typescript": "^5.9.3"
},
"dependencies": {
"@huggingface/inference": "^4.11.3",
"@llm-tools/embedjs": "^0.1.29",
"@llm-tools/embedjs-hnswlib": "^0.1.29",
"@llm-tools/embedjs-huggingface": "^0.1.29",
"@llm-tools/embedjs-lmdb": "^0.1.29",
"@llm-tools/embedjs-loader-web": "^0.1.29",
"tsx": "^4.20.6"
}
}
Code to reproduce:
import path from "path";
import { loadEnvFile } from "process";
import { RAGApplicationBuilder } from "@llm-tools/embedjs";
import { HNSWDb } from "@llm-tools/embedjs-hnswlib";
import { WebLoader } from "@llm-tools/embedjs-loader-web";
import { LmdbStore } from "@llm-tools/embedjs-lmdb";
import {
HuggingFace,
HuggingFaceEmbeddings,
} from "@llm-tools/embedjs-huggingface";
loadEnvFile();
const app = await new RAGApplicationBuilder()
.setModel(new HuggingFace({ modelName: "IlyaGusev/saiga_llama3_8b" }))
.setEmbeddingModel(
new HuggingFaceEmbeddings({
apiKey: process.env.HUGGINGFACEHUB_API_KEY!,
model: "sentence-transformers/all-MiniLM-L6-v2",
}),
)
.setVectorDatabase(new HNSWDb())
.setStore(new LmdbStore({ path: path.resolve("./store") }))
.build();
await app.addLoader(
new WebLoader({
urlOrContent:
"https://ru.wikipedia.org/wiki/%D0%99%D0%BE%D1%80%D1%83%D0%B1%D0%B0_(%D1%8F%D0%B7%D1%8B%D0%BA)",
}),
);
console.log(await app.query("Расскажи о языке народа Йоруба"));
This issue is stale because it has been open for 14 days with no activity.
I have updated the package versions in the newest release (0.1.30). Could you let me know if that addresses the issue you are facing?
Hi! The new release solved the problem. Thank you; your work gives the pleasant feeling of not needing Python to work with LLMs. It's very useful, and that's exactly what has been missing in Node.js.