Tools: πŸŽ“ Science Teacher Chatbot β€” Full Explanation (Everything)

Tools: πŸŽ“ Science Teacher Chatbot β€” Full Explanation (Everything)

## 🌱 0️⃣ What We Are Building

We are building a **Science Teacher AI chatbot** backend. It:

- answers science questions
- explains like a teacher
- remembers the chat
- ignores non-science topics

Example conversation:

> Student: What is gravity?
> AI: Gravity is a force…
> Student: explain again
> AI: As I explained earlier…

👉 The AI remembers context.

## 🧠 1️⃣ How the System Works

- Student sends a question
- The Express API receives it
- The science filter checks it
- LangChain adds memory
- OpenAI generates the answer
- The API returns the response

## 🧰 2️⃣ Technologies

- Node.js → backend runtime
- Express → API framework
- LangChain → LLM framework
- OpenAI → AI brain
- BufferMemory → chat memory
- dotenv → API key loading
- pnpm → package manager
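Before touching any real libraries, the whole pipeline (validate → filter → memory → model → response) can be sketched in a few lines of plain JavaScript. Everything below is a hypothetical stand-in, with no Express, no LangChain, and no real OpenAI call; only the order of the steps matters here:

```javascript
// Stand-in science filter (the real one comes later, in filter.mjs).
const isScience = (text) => /gravity|atom|energy|photosynthesis/i.test(text);

// Stand-in for the LangChain + OpenAI step; a real app would call the model.
async function generateAnswer(question, history) {
  return `(teacher answer to "${question}", given ${history.length} earlier turns)`;
}

const history = []; // stand-in for BufferMemory

async function handleQuestion(text) {
  if (!text) return { error: "Question required" };          // validate
  if (!isScience(text)) {
    return { answer: "I only answer science questions." };   // filter
  }
  const answer = await generateAnswer(text, history);        // model + memory
  history.push({ student: text, teacher: answer });          // remember the turn
  return { answer };
}

handleQuestion("What is gravity?").then(console.log);
```

Each file we write below replaces one of these stand-ins with the real thing.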
The flow looks like this:

```
Student → API → Filter → LangChain → OpenAI → Answer
                              ↑
                           Memory
```

## 📦 3️⃣ Create Project

```
mkdir science-teacher-bot
cd science-teacher-bot
pnpm init
```

## 📦 4️⃣ Install Dependencies

```
pnpm add express cors dotenv langchain @langchain/openai nodemon
```

## 📁 5️⃣ Folder Structure

```
science-teacher-bot/
│
├── src/
│   ├── memory.mjs
│   ├── llm.mjs
│   ├── filter.mjs
│   ├── route.mjs
│   └── server.mjs
│
├── .env
└── package.json
```

## 🔐 6️⃣ OpenAI Key

Create a `.env` file:

```
OPENAI_API_KEY=your_key_here
PORT=3000
```

## 💾 7️⃣ memory.mjs

```js
import { BufferMemory } from "langchain/memory";

// Conversation memory: keeps every message and exposes the
// transcript to the prompt under the key "chat_history".
export const memory = new BufferMemory({
  returnMessages: true,
  memoryKey: "chat_history",
});
```

## What it does

- stores chat history
- remembers previous messages
- gives context to the AI
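`BufferMemory` itself is a LangChain class, but the idea is easy to see in miniature. The sketch below is not the real API, just a simplified stand-in showing what the memory contributes: every turn is saved, and the whole transcript is handed back to the prompt as `{chat_history}`:

```javascript
// Simplified stand-in for what BufferMemory does (NOT the real LangChain API).
class TinyMemory {
  constructor() {
    this.messages = [];
  }
  // Record both sides of an exchange.
  saveContext(studentText, teacherText) {
    this.messages.push({ role: "Student", text: studentText });
    this.messages.push({ role: "Teacher", text: teacherText });
  }
  // Build the transcript that would fill {chat_history} in the prompt.
  transcript() {
    return this.messages.map((m) => `${m.role}: ${m.text}`).join("\n");
  }
}

const mem = new TinyMemory();
mem.saveContext("What is gravity?", "Gravity is a force...");
mem.saveContext("explain again", "As I explained earlier...");
console.log(mem.transcript());
```

Because the full transcript is re-sent on every call, the model can answer "explain again" sensibly; that is also why this memory grows without bound until the process restarts.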
## 🧠 8️⃣ llm.mjs

Note: `ConversationChain` expects a prompt template object, not a raw string, so we build the teacher persona with `ChatPromptTemplate` and feed the stored history in through a `MessagesPlaceholder`:

```js
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { memory } from "./memory.mjs";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0.3,
});

// System message fixes the teacher persona, the placeholder receives the
// stored chat history, and {input} is the new student question.
const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a science teacher. Answer only science questions. Explain in a simple way.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

export const chatChain = new ConversationChain({ llm, memory, prompt });
```

## What it does

- creates the AI model
- defines the teacher behavior
- connects the memory
- creates the chatbot

## 🔬 9️⃣ filter.mjs

```js
// Returns true if the question mentions any science keyword.
export function isScience(text) {
  const words = [
    "physics", "chemistry", "biology",
    "atom", "cell", "energy", "force",
    "gravity", "plant", "reaction",
    "photosynthesis", "molecule",
  ];
  return words.some((w) => text.toLowerCase().includes(w));
}
```

## What it does

- checks if the question is about science
- blocks other topics

## 🌐 1️⃣0️⃣ route.mjs

```js
import express from "express";
import { chatChain } from "./llm.mjs";
import { isScience } from "./filter.mjs";

export const router = express.Router();

router.post("/", async (req, res) => {
  const { text } = req.body;

  if (!text) {
    return res.status(400).json({ error: "Question required" });
  }

  if (!isScience(text)) {
    return res.json({ answer: "I only answer science questions." });
  }

  const response = await chatChain.predict({ input: text });
  res.json({ answer: response });
});
```

## What it does

- receives the question
- validates the text
- checks that it is science
- sends the answer
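A quick aside on the filter's limits: it is plain substring matching, so science questions that avoid the listed keywords get rejected, while unrelated sentences containing a keyword slip through. Reproducing the function here makes that easy to check:

```javascript
// The same keyword check as filter.mjs, inlined for a quick demo.
function isScience(text) {
  const words = [
    "physics", "chemistry", "biology",
    "atom", "cell", "energy", "force",
    "gravity", "plant", "reaction",
    "photosynthesis", "molecule",
  ];
  return words.some((w) => text.toLowerCase().includes(w));
}

console.log(isScience("What is gravity?"));       // true: matches "gravity"
console.log(isScience("Why is the sky blue?"));   // false: science, but no keyword
console.log(isScience("I have no energy today")); // true: not science, but matches "energy"
```

For a sturdier filter you could ask the model itself to classify the question first, at the cost of an extra API call per request.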
## 🚀 1️⃣1️⃣ server.mjs

```js
import express from "express";
import cors from "cors";
import "dotenv/config";
import { router } from "./route.mjs";

const app = express();

app.use(cors());
app.use(express.json());
app.use("/ask", router);

app.get("/", (req, res) => {
  res.send("Science Teacher AI running");
});

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.log("Server running on", PORT);
});
```

## What it does

- creates the server
- enables JSON parsing
- mounts the API route
- starts the backend

## ▶️ 1️⃣2️⃣ Run Project

Add a script in `package.json`:

```json
"scripts": {
  "dev": "node src/server.mjs"
}
```

Then run:

```
pnpm dev
```

## 🧪 1️⃣3️⃣ Test API

POST → http://localhost:3000/ask

Request body:

```json
{ "text": "What is photosynthesis?" }
```

Response:

```json
{ "answer": "Photosynthesis is the process..." }
```
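One gap worth noting in route.mjs: `chatChain.predict` is awaited with no try/catch, so if the OpenAI request fails, Express 4 never sends a response and the client hangs. A small wrapper (a hypothetical addition, not part of the original code) turns rejections into a 500; here it is exercised with plain mock objects so you can run it without a server:

```javascript
// Wrap an async Express-style handler so rejected promises become a 500.
function asyncHandler(fn) {
  return (req, res) => {
    Promise.resolve(fn(req, res)).catch((err) => {
      res.status(500).json({ error: err.message });
    });
  };
}

// Exercise it with mock req/res objects (no Express server needed).
const handler = asyncHandler(async () => {
  throw new Error("OpenAI unavailable"); // simulate a failed model call
});

const res = {
  code: 200,
  body: null,
  status(c) { this.code = c; return this; },
  json(b) { this.body = b; return this; },
};

handler({}, res);
setTimeout(() => console.log(res.code, res.body), 0);
```

In route.mjs you would write `router.post("/", asyncHandler(async (req, res) => { ... }))`. Express 5 forwards rejected promises to the error middleware on its own, so this wrapper is mainly needed on Express 4.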
## 🧠 1️⃣4️⃣ Memory Demo

Ask in sequence:

1. "What is gravity?"
2. "explain again"

The AI remembers context and refers back to its first answer.

## ⚠️ 1️⃣5️⃣ Important Notes

- Memory is shared across all users
- Memory resets on restart
- For production, move to per-user sessions and persistent storage such as a vector store

## 🏆 1️⃣6️⃣ What You Built (Hero Level)

✅ LLM backend
✅ LangChain integration
✅ Memory chatbot
✅ Science filter
✅ Express API
✅ Teacher AI

This is real AI app architecture.

## 🎓 1️⃣7️⃣ How to Explain to Students

We built an AI teacher using OpenAI. Express receives student questions. LangChain connects the AI and memory. Memory stores the conversation. The filter ensures science-only answers.
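The "shared across all users" limitation can be addressed by keeping one history per session ID. The sketch below shows the idea in plain JavaScript with illustrative names (these are not LangChain APIs); in the real app you would keep one `BufferMemory` and one chain per session in the same way:

```javascript
// One history per session ID instead of a single shared memory.
const sessions = new Map();

// Look up (or lazily create) the history for one student.
function getHistory(sessionId) {
  if (!sessions.has(sessionId)) sessions.set(sessionId, []);
  return sessions.get(sessionId);
}

// Record one exchange under that student's session.
function remember(sessionId, question, answer) {
  getHistory(sessionId).push({ question, answer });
}

remember("alice", "What is gravity?", "Gravity is a force...");
remember("bob", "What is a cell?", "A cell is the smallest unit of life...");

console.log(getHistory("alice").length); // 1: alice's history does not see bob's
```

The route would then read a session ID from the request (a header or body field) and pass the matching memory to the chain. Histories still reset on restart; for true persistence they would need to live in a database.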