I Built a Real-Time Artemis II 3D Tracker in One Session — Here's the Engineering Pipeline That Made It Possible

Source: DEV Community
On April 1, 2026, four astronauts launched aboard Orion on Artemis II, humanity's first crewed voyage beyond low Earth orbit since Apollo 17 in 1972. I wanted to track it. Not on a static NASA page. Not on someone else's stream overlay. I wanted an interactive 3D visualization with real telemetry, in my browser, that I built myself.

Six hours later, after a single afternoon, I had one. Live at artemis-tracker-murex.vercel.app:

- 47 files
- ~8,000 lines of TypeScript
- 15 unit tests
- 5 serverless API proxies
- Degree-8 Lagrange interpolation at 60fps
- An AI mission chatbot
- Deep Space Network status
- Deployed on Vercel

It was built in a single session using Claude Code with a structured engineering pipeline called Wrought. This post isn't about "look what AI can do." It's about what happens when you give an AI agent engineering discipline instead of just a prompt.

What the App Does

ARTEMIS is a real-time 3D mission tracker that combines three NASA data sources into one interactive visualization:

- OEM Ephe
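The degree-8 Lagrange interpolation mentioned above is the standard trick for turning sparse ephemeris samples into smooth 60fps positions: pick the 9 samples bracketing the current time and evaluate the unique degree-8 polynomial through them. A minimal sketch in TypeScript (the `EphemerisSample` shape and function names here are illustrative assumptions, not the app's actual API):

```typescript
// Illustrative ephemeris sample: time plus a 3D position, as you might
// parse out of a NASA OEM file. Not the tracker's real data model.
interface EphemerisSample {
  t: number;                      // seconds since some epoch
  pos: [number, number, number];  // km in some inertial frame
}

// Evaluate the degree-(n-1) Lagrange polynomial through n samples at time t.
function lagrange(samples: EphemerisSample[], t: number): [number, number, number] {
  const out: [number, number, number] = [0, 0, 0];
  for (let i = 0; i < samples.length; i++) {
    // Lagrange basis weight for sample i at time t.
    let w = 1;
    for (let j = 0; j < samples.length; j++) {
      if (j !== i) w *= (t - samples[j].t) / (samples[i].t - samples[j].t);
    }
    for (let k = 0; k < 3; k++) out[k] += w * samples[i].pos[k];
  }
  return out;
}

// Pick the 9 samples surrounding t (degree 8) from a time-sorted ephemeris,
// clamping the window at the ends of the table.
function interpolatePosition(eph: EphemerisSample[], t: number): [number, number, number] {
  let lo = eph.findIndex(s => s.t > t);   // first sample strictly after t
  if (lo < 0) lo = eph.length;            // t is past the last sample
  const start = Math.max(0, Math.min(eph.length - 9, lo - 5));
  return lagrange(eph.slice(start, start + 9), t);
}
```

Because the polynomial passes exactly through every sample, the rendered track never visibly "snaps" between ephemeris points; the per-frame cost is a few hundred multiplies, which is negligible at 60fps.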