Thursday, July 3, 2025

Taken with Transportation Podcast: Tip of the Cap (and Gown) to Our Dads and Grads

By Melissa Culross

Parking Control Officers Lloyd Glover, Willie Moore and Harold Laughlin are among the dads we profile in our new podcast episode.

A lot of things happen as spring ends and summer begins. The days get longer. The weather warms up, in most places, anyway. As we know, San Francisco has its own definition of “summer weather.” And we celebrate dads and grads. In honor of Father’s Day and graduation season, we are profiling some of our own dads and grads in a special episode of our Taken with Transportation podcast. “Tip of the Cap (and Gown) to Our Dads and Grads” features eight members of our...



Published July 02, 2025 at 05:30AM
https://ift.tt/5K7tPWN

Wednesday, July 2, 2025

Show HN: Core – open source memory graph for LLMs – shareable, user owned https://ift.tt/2eL9jK6

I keep running into the same problem: each AI app “remembers” me in its own silo. ChatGPT knows my project details, Cursor forgets them, Claude starts from zero… so I end up re-explaining myself dozens of times a day across these apps.

The deeper problem:
1. Not portable – context is vendor-locked; nothing travels across tools.
2. Not relational – most memory systems store only the latest fact (“sticky notes”) with no history or provenance.
3. Not yours – your AI memory is sensitive first-party data, yet you have no control over where it lives or how it’s queried.

Demo video: https://youtu.be/iANZ32dnK60
Repo: https://ift.tt/3S58tw7

What we built:
- CORE (Context Oriented Relational Engine): an open source, shareable knowledge graph (your memory vault) that lets any LLM (ChatGPT, Cursor, Claude, SOL, etc.) share and query the same persistent context.
- Temporal + relational: every fact gets a full version history (who, when, why), and nothing is wiped out when you change it—just timestamped and retired.
- Local-first or hosted: run it offline in Docker, or use our hosted instance. You choose which memories sync and which stay private.

Try it:
- Hosted free tier (HN launch): https://core.heysol.ai
- Docs: https://ift.tt/SyWLlmf

https://ift.tt/3S58tw7 July 1, 2025 at 09:54PM
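As a rough illustration of the shared-memory idea above (this is not CORE's documented API; the endpoint paths, payload fields, port, and auth header below are hypothetical), one tool writing a fact and another tool querying it could look roughly like this:

```python
# Hypothetical sketch of talking to a shared memory service over HTTP.
# Endpoint paths, payload fields, and the auth header are illustrative,
# not CORE's documented API.
import requests

BASE = "http://localhost:8080"                    # local Docker instance (assumed port)
HEADERS = {"Authorization": "Bearer <your-token>"}

# Store a fact once; any connected assistant can later retrieve it.
requests.post(
    f"{BASE}/memory/facts",
    json={"statement": "The project uses Postgres 16",
          "source": "cursor-session-2025-07-01"},
    headers=HEADERS,
)

# Query the same memory from a different tool.
resp = requests.post(
    f"{BASE}/memory/query",
    json={"question": "Which database does the project use?"},
    headers=HEADERS,
)
print(resp.json())
```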

Show HN: Lifp – A Lisp Built on Bun https://ift.tt/Z5vArWB

A silly summer break project where I played around with Bun, which I hadn't had the chance to try yet. It's not super lisp-y: there's no car, no cdr, and no macros. It's a Lisp-Inspired Functional Programming language that ideally doesn't require a paradigm shift when you come from C-like languages.

https://ift.tt/8M9fTlV July 1, 2025 at 08:04PM

Tuesday, July 1, 2025

Show HN: C.O.R.E – Opensource, user owned, shareable memory for Claude, Cursor https://ift.tt/hn326jt

Hi HN, I keep running into the same problem: each AI app “remembers” me in its own silo. ChatGPT knows my project details, Cursor forgets them, Claude starts from zero… so I end up re-explaining myself dozens of times a day across these apps.

The deeper problem:
1. Not portable – context is vendor-locked; nothing travels across tools.
2. Not relational – most memory systems store only the latest fact (“sticky notes”) with no history or provenance.
3. Not yours – your AI memory is sensitive first-party data, yet you have no control over where it lives or how it’s queried.

Demo video: https://youtu.be/iANZ32dnK60
Repo: https://ift.tt/3S58tw7

What we built:
- CORE (Context Oriented Relational Engine): an open source, shareable knowledge graph (your memory vault) that lets any LLM (ChatGPT, Cursor, Claude, SOL, etc.) share and query the same persistent context.
- Temporal + relational: every fact gets a full version history (who, when, why), and nothing is wiped out when you change it—just timestamped and retired.
- Local-first or hosted: run it offline in Docker, or use our hosted instance. You choose which memories sync and which stay private.

Why this matters:
- Ask “What’s our roadmap now?” and “What was it last quarter?” — timeline and authorship are always preserved.
- Change a preference (e.g. “I no longer use shadcn”) — assistants see both old and new memory, so no more stale facts or embarrassing hallucinations.
- Every answer is traceable: hover a fact to see who/when/why it got there.

Try it:
- Hosted free tier (HN launch): https://core.heysol.ai
- Docs: https://ift.tt/SyWLlmf

https://ift.tt/3S58tw7 July 1, 2025 at 02:40AM
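The "timestamped and retired" behavior described above is easy to picture with a small sketch. The data model below is purely illustrative and assumes nothing about CORE's internal schema: an update retires the previous version rather than overwriting it, so both current and historical questions stay answerable.

```python
# Minimal sketch of temporal fact versioning: old values are retired,
# never deleted, so current and historical answers stay queryable.
# Illustration only, not CORE's internal schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class FactVersion:
    value: str
    author: str                      # who
    created_at: datetime             # when
    reason: str                      # why
    retired_at: Optional[datetime] = None

@dataclass
class Fact:
    subject: str
    versions: List[FactVersion] = field(default_factory=list)

    def update(self, value: str, author: str, reason: str) -> None:
        now = datetime.now(timezone.utc)
        if self.versions and self.versions[-1].retired_at is None:
            self.versions[-1].retired_at = now   # retire, don't erase
        self.versions.append(FactVersion(value, author, now, reason))

    def current(self) -> Optional[str]:
        live = [v for v in self.versions if v.retired_at is None]
        return live[-1].value if live else None

    def as_of(self, when: datetime) -> Optional[str]:
        # Answer "what was it back then?" by walking the version history.
        for v in reversed(self.versions):
            if v.created_at <= when and (v.retired_at is None or v.retired_at > when):
                return v.value
        return None

# Usage: "I no longer use shadcn" keeps the old preference in history.
pref = Fact("ui-library")
pref.update("shadcn", author="user", reason="initial setup")
pref.update("none", author="user", reason="no longer using shadcn")
print(pref.current())   # "none"; the "shadcn" version is still in history
```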

Show HN: Audiopipe – Pipeline for audio diarization, denoising and transcription https://ift.tt/wQkI7Jl

Audiopipe is a one-liner for denoising, diarization, and transcription with demucs + pyannote + insanely-fast-whisper. I made it to transcribe podcasts and Dungeons and Dragons sessions and figured it could be useful to others. It also has a WASM UI for loading transcriptions and audio. Feel free to contribute! Feedback appreciated.

https://ift.tt/zt4YUMa July 1, 2025 at 01:02AM
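For a sense of what such a chain involves, here is a conceptual sketch of the denoise → diarize → transcribe flow. This is not Audiopipe's actual code; the model names, flags, and output paths are assumptions and may differ between library versions.

```python
# Conceptual sketch of a denoise -> diarize -> transcribe chain.
# Not Audiopipe's code; model names, flags, and paths are assumptions.
import subprocess
from pyannote.audio import Pipeline
from transformers import pipeline

AUDIO = "session.wav"

# 1) Isolate speech with demucs (vocals stem only).
subprocess.run(["demucs", "--two-stems", "vocals", AUDIO], check=True)
vocals = "separated/htdemucs/session/vocals.wav"   # default output layout (assumed)

# 2) Speaker diarization with pyannote.
diarizer = Pipeline.from_pretrained("pyannote/speaker-diarization-3.1",
                                    use_auth_token="hf_...")
diarization = diarizer(vocals)

# 3) Transcription with a Whisper model, keeping timestamps so segments
#    can later be matched back to speaker turns.
asr = pipeline("automatic-speech-recognition",
               model="openai/whisper-large-v3",
               chunk_length_s=30,
               return_timestamps=True)
result = asr(vocals)

for turn, _, speaker in diarization.itertracks(yield_label=True):
    print(f"{speaker}: {turn.start:.1f}s to {turn.end:.1f}s")
print(result["text"])
```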

Show HN: We're two coffee nerds who built an AI app to track beans and recipes https://ift.tt/kRhoS4F

It’s available on iOS now: https://ift.tt/k3Fw2DN

We got into specialty coffee during COVID and, like many others, fell deep down the rabbit hole. Along the way, we ran into the same frustrations:
- A drawer full of empty coffee bags.
- No simple way to track grind size, rest dates, and notes by bean.
- Our coffee history scattered across photos, screenshots, notebooks, and half-memories.
- The unique traits, people, and stories behind each coffee disappearing from the internet once it sold out (since coffee is an agricultural good).
- In our opinion, no coffee tool really captures the flavor, emotion, and aesthetic of great coffee from a design perspective.

So we built BeanBook: a coffee notebook to log beans, extract recipes, and organize your coffee life in one place with just a snap, powered by AI.

Here’s what it does:
- Snap a bag → auto-detects roaster, origin, process, roast date, notes, producer, farm, and more
- Paste a YouTube link or photo → extracts a structured recipe automatically
- Log grind size, roast timeline, ratings & notes → all saved in a clean, elegant UI
- See your coffee year in review → track habits, trends, and favorites
- Ask BeanBook AI → from brew temps to bean facts, get instant answers

My co-founder and I built everything ourselves: branding, code, and UX design. If you’re into coffee (or trying to get more into it), we’d love your feedback.

- Rokey & Eric

https://beanbook.app July 1, 2025 at 12:08AM
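As a rough picture of what a "snap a bag" extraction might produce, here is an illustrative record with the fields named in the post. This is not BeanBook's actual data model; the class and field names are assumptions.

```python
# Sketch of the kind of structured record a bag-photo extraction might
# yield. Field names follow the post; this is illustrative, not
# BeanBook's actual schema.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CoffeeBag:
    roaster: str
    origin: str
    process: str                  # e.g. "washed", "natural", "honey"
    roast_date: Optional[str]     # date as printed on the bag
    producer: Optional[str]
    farm: Optional[str]
    tasting_notes: List[str]

bag = CoffeeBag(
    roaster="Example Roasters",
    origin="Ethiopia, Yirgacheffe",
    process="washed",
    roast_date="2025-06-20",
    producer=None,
    farm=None,
    tasting_notes=["jasmine", "bergamot", "stone fruit"],
)
print(bag)
```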

Show HN: Attach Gateway – one-command OIDC/DID auth for local LLMs https://ift.tt/YI3x9Gt

We’ve been building local and on-prem agent workflows for open-source LLMs. Engines like Ollama or vLLM ship with no auth, so every team ends up writing the same JWT proxy. Attach Gateway is a single process that sits in front of any model server and handles the boring bits:
- verifies OIDC / DID JWTs
- adds X-Attach-User and X-Attach-Session headers so downstream agents share the same identity
- optional /a2a/tasks/send endpoint for Google-style A2A hand-offs
- mirrors prompts + completions to Weaviate (runs in Docker)

One `pip install attach-dev`, export a token, run `attach-gateway`, and your local Ollama is protected in under 60 seconds.

Repo: https://ift.tt/kYuNLF8
PyPI: https://ift.tt/EFlSuRC

Would love feedback – especially from teams doing multi-agent or on-prem work.

https://ift.tt/kYuNLF8 June 30, 2025 at 11:38PM
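To show how a service behind the gateway might consume the injected identity, here is a minimal sketch of a downstream agent reading the X-Attach-User and X-Attach-Session headers. The header names come from the post; the framework, route, and response shape are illustrative assumptions.

```python
# Sketch of a downstream agent sitting behind Attach Gateway and reading
# the identity headers the gateway injects. Header names come from the
# post; FastAPI, the route, and the response shape are assumptions.
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

@app.post("/agent/task")
async def handle_task(
    payload: dict,
    x_attach_user: str | None = Header(default=None),
    x_attach_session: str | None = Header(default=None),
):
    # The gateway has already verified the OIDC/DID JWT; downstream code
    # only needs the forwarded identity to scope its work.
    if x_attach_user is None:
        raise HTTPException(status_code=401, detail="missing gateway identity")
    return {"user": x_attach_user, "session": x_attach_session, "ok": True}
```

The point of the design is that every agent in a multi-agent chain sees the same user and session identity without re-implementing token verification.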

Show HN: Nocturne – Your Car Thing's Second Chapter https://ift.tt/Xf2ojAy

Hello HN! We recently released Nocturne 3.0.0, which is a complete replacement...