Show HN for December 7, 2025
12 posts

Geetanjali – RAG-powered ethical guidance from the Bhagavad Gita #
The problem: The Gita has 701 verses. Finding applicable wisdom for a specific situation requires either deep familiarity or hours of reading.
How it works:
1. The user describes their ethical dilemma
2. The query is embedded using sentence-transformers
3. ChromaDB retrieves the top-k semantically similar verses
4. The LLM generates structured output: three options with tradeoffs, implementation steps, and verse citations
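A minimal sketch of the retrieval step described above, assuming the verses are already embedded and stored; the database path, collection name, and helper function are illustrative, not taken from the repo:

```python
# Sketch of the retrieval step (paths and names are illustrative).
import chromadb
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")
client = chromadb.PersistentClient(path="./gita_db")
collection = client.get_or_create_collection("gita_verses")

def retrieve_verses(dilemma: str, k: int = 5) -> list[dict]:
    """Embed the user's dilemma and pull the k most similar verses."""
    query_embedding = embedder.encode(dilemma).tolist()
    results = collection.query(query_embeddings=[query_embedding], n_results=k)
    return [
        {"verse": doc, "metadata": meta}
        for doc, meta in zip(results["documents"][0], results["metadatas"][0])
    ]

verses = retrieve_verses("Should I report a colleague who is cutting corners?")
```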
Tech stack:
- Backend: FastAPI, PostgreSQL, Redis
- Vector DB: ChromaDB with all-MiniLM-L6-v2 embeddings
- LLM: Ollama (qwen2.5:3b) primary, Anthropic Claude fallback
- Frontend: React + TypeScript + Tailwind
Key design decisions:
- RAG to prevent hallucination: every recommendation cites actual verses
- Confidence scoring flags low-quality outputs for review
- Structured JSON output for consistent UX
- Local LLM option for privacy and zero API costs
What I learned:
- LLM JSON extraction is harder than expected. I built a three-layer fallback (direct parse → markdown block extraction → raw_decode scanning); see the sketch below.
- Semantic search on religious texts works surprisingly well for ethical queries.
- Smaller models (3B parameters) work fine when constrained by good prompts and retrieved context.
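A rough sketch of what such a three-layer fallback can look like; the function name and regex are illustrative, not the project's actual code:

```python
import json
import re

def extract_json(reply: str) -> dict | None:
    """Try three progressively looser strategies to pull JSON out of an LLM reply."""
    # Layer 1: the reply is already valid JSON.
    try:
        return json.loads(reply)
    except json.JSONDecodeError:
        pass

    # Layer 2: JSON wrapped in a fenced markdown block (```json ... ```).
    match = re.search(r"```(?:json)?\s*(\{.*?\})\s*```", reply, re.DOTALL)
    if match:
        try:
            return json.loads(match.group(1))
        except json.JSONDecodeError:
            pass

    # Layer 3: scan for the first decodable JSON object anywhere in the text.
    decoder = json.JSONDecoder()
    for start in range(len(reply)):
        if reply[start] != "{":
            continue
        try:
            obj, _ = decoder.raw_decode(reply, start)
            return obj
        except json.JSONDecodeError:
            continue
    return None
```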
GitHub: https://github.com/geetanjaliapp/geetanjali
Happy to discuss the RAG architecture or take feedback.
Minimal container-like sandbox built from scratch in C #
GitHub: https://github.com/Sahilb315/runbox
Happy to hear feedback or suggestions.
AI Paul Graham #
Nia gives coding agents accurate context by indexing entire codebases, documentation, and packages. It fixes hallucinations by letting agents retrieve real source information instead of guessing. Developers ship faster because the AI can read and understand their actual project. This is a different use case for Nia, but it apparently works well here too.
You can chat with it and ask any question because it has access to Nia's knowledge base, which has indexed all of Paul Graham's essays. The agent can call multiple tools that use Nia's API directly:
- NiaWebSearch - searches the web
- searchEssays - semantic search over all essays
- browseEssays - shows the full tree of essays
- listDirectory - lists essays in a path
- readEssay - reads full essay content
- grepEssays - regex pattern search
- getSourceContent - retrieves full source by identifier
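Purely for illustration, here is how a couple of those tools could be expressed as function-calling schemas with a small dispatcher; the `nia_client` methods are placeholders and not Nia's real API:

```python
# Illustrative tool schemas in the common JSON function-calling format.
# The handlers call a hypothetical `nia_client`; Nia's actual API may differ.
TOOLS = [
    {
        "name": "searchEssays",
        "description": "Semantic search over all of Paul Graham's essays.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
    {
        "name": "readEssay",
        "description": "Read the full content of one essay by its path.",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
]

def dispatch(tool_name: str, arguments: dict, nia_client):
    """Route a model-issued tool call to the matching (placeholder) handler."""
    if tool_name == "searchEssays":
        return nia_client.search(arguments["query"])
    if tool_name == "readEssay":
        return nia_client.read(arguments["path"])
    raise ValueError(f"Unknown tool: {tool_name}")
```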
Models:
- anthropic/claude-sonnet-4.5
- moonshotai/kimi-k2-thinking
- xai/grok-4-fast-reasoning
- alibaba/qwen3-vl-thinking
It’s fully free and open source.
Try it: https://www.paulgraham-nia.com/
Code: https://github.com/nozomio-labs/paulgraham-ai
OpenFret – Guitar inventory, AI practice, and a note-detection RPG #
What it does:
1) Smart inventory – Add your guitars, get auto-filled specs from ~1,000 models in the database. Track woods, pickups, tunings, string changes, photos.
2) AI practice sessions – Generate personalized tabs and lessons based on your practice history. Rendered with VexFlow notation.
3) Session Mode – Version-controlled music collaboration (think Git for audio). Fork tracks, add layers, see history, merge contributions.
4) Musical tools – Tuner, metronome, scale visualizer, chord progressions, fretboard maps. Last.fm integration for tracking what songs you're learning.
5) Guitar RPG – Fight monsters by playing real guitar notes. Web Audio API detects your playing. 300+ hand-crafted lessons from beginner to advanced.
What you can try without signing up: the RPG demo is completely free and needs no account. Go to https://openfret.com/game, click "Start Battle", and play. It's capped at level 10 but gives you a real feel for the note detection.
The full platform (inventory, AI practice, sessions) requires Discord or magic link auth.
Current state: Beta. Core features work, actively adding content. The RPG has 300+ lessons done with more coming. Full game is $10 one-time, everything else is free.
Why I built it: I have a basement music setup and wanted one place to track when I last changed strings, get practice material that adapts to what I'm working on, and collaborate without DM'ing WAV/MP3 files.
Tech: Next.js (T3), Web Audio API for pitch detection, VexFlow for notation, Strudel integration for algorithmic backing tracks, Last.fm API.
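OpenFret's note detection runs in the browser on the Web Audio API. As a rough illustration of the general technique only, here is an autocorrelation-based pitch estimator sketched in Python; the sample rate and frequency bounds are assumptions, not the project's actual values:

```python
import numpy as np

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def detect_pitch(samples: np.ndarray, sample_rate: int = 44100) -> float:
    """Estimate the fundamental frequency of a mono buffer via autocorrelation."""
    samples = samples - samples.mean()
    corr = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    min_lag = sample_rate // 1000   # ignore frequencies above ~1 kHz
    max_lag = sample_rate // 60     # ignore frequencies below ~60 Hz
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    return sample_rate / lag

def freq_to_note(freq: float) -> str:
    """Map a frequency to the nearest note name (A4 = 440 Hz)."""
    midi = int(round(69 + 12 * np.log2(freq / 440.0)))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"
```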
Happy to answer questions about the AI tab generation, note detection, or the Git-style collaboration model.
A Markdown document manager in Rust #
I’m an ex-Amazon engineer. I built Seychl because I was tired of waiting 3 seconds for my notes to load in cloud-based apps.
Seychl is a local-first knowledge base designed for speed. UI interactions are always instant (<16ms).
Features for power users:
- Full keyboard control (never touch the mouse)
- Vim mode built-in
- Markdown storage (you own your data)
- Instant search across 10k+ notes
- Persistent Tmux-like sessions, windows and panes
It’s basically "Linear for knowledge management": it prioritizes ergonomics and speed over feature bloat.
You can download the binary here (currently macOS only): https://github.com/Seychl/seychl-release/releases/download/0...
I was frustrated by 85% of my technical interviews, so I built SharpSkill #
My name is Benjamin, and like many developers, I was tired of failing technical interviews for all kinds of reasons.
I made SharpSkill in order to change that.
Real-world use cases, flashcards, and interview simulators.
One goal: destroy your next technical interview.
I made an AI tool that applies to jobs via cold email #
The tool extracts information from your resume, matches it with the job description, writes a personalised email in your style, and either finds the company email automatically or lets you set your own. It then sends the email with your CV attached through your own personal email account.
There are two modes: Rapid Mode uses LinkedIn search results, and CSV Mode lets you upload a list of jobs and apply to all of them in one go.
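The delivery step is ordinary SMTP with an attachment. A minimal sketch, with the personalization and email-lookup steps omitted and all host, credential, and file names purely illustrative:

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

def send_application(smtp_host: str, user: str, password: str,
                     to_addr: str, subject: str, body: str, cv_path: str) -> None:
    """Send a personalized application email with the CV attached, from the user's own account."""
    msg = EmailMessage()
    msg["From"] = user
    msg["To"] = to_addr
    msg["Subject"] = subject
    msg.set_content(body)
    msg.add_attachment(
        Path(cv_path).read_bytes(),
        maintype="application",
        subtype="pdf",
        filename=Path(cv_path).name,
    )
    with smtplib.SMTP_SSL(smtp_host, 465) as server:
        server.login(user, password)
        server.send_message(msg)
```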
I originally built it for myself, but decided to share it in case it helps anyone else dealing with the same frustrations. Happy to answer questions or get feedback from the community.