Inside xAI and Future of Millions of Emulated Humans
- by NextBigFuture
- Jan 17, 2026
Brian Wang
Interview with Sulaiman Ghori, Member of Technical Staff at xAI. xAI plans to emulate millions of humans on distributed Tesla chips; Ghori says Hardware 4 (HW4) is good enough for the job.
AI broadly, and xAI in particular, is hardware constrained, and xAI's biggest edge is its speed of hardware deployment.
Predicting future bottlenecks
Elon excels at forecasting bottlenecks months and years ahead and working backwards from them. xAI teams adopt this approach by focusing on core financial and physical metrics and by eliminating perceived software limitations (latency, overhead) for 2–8× gains.
Macrohard model iterations happen multiple times a day, even during pre-training. New hardware racks start training within hours of setup, sometimes the same day, versus days or weeks elsewhere; the supercompute team removes the typical barriers.
Ghori joined xAI after earlier startup attempts, following outreach from Greg Yang.
Bootstrapping off the Tesla network
There is potential to deploy millions of human emulators cheaply on idle Tesla vehicles, since Hardware 4, networking, cooling, and power are already in place. There are about 3 million HW4 cars in the USA. Paying owners to lease compute time would be more capital-efficient than buying AWS, Oracle, or NVIDIA hardware, enabling massive scale without new buildouts.
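To make the capital-efficiency argument concrete, here is a minimal back-of-the-envelope sketch in Python. Only the roughly 3 million HW4 fleet size comes from the interview; the idle fraction, per-car throughput, owner lease rate, and cloud accelerator price are hypothetical placeholders for illustration, not figures from xAI or Tesla.

```python
# Back-of-the-envelope sketch of the distributed Tesla fleet idea.
# Only FLEET_HW4_CARS comes from the article; every other number is a
# hypothetical placeholder, not a figure from the interview.

FLEET_HW4_CARS = 3_000_000      # from the article: ~3M HW4 cars in the USA
IDLE_FRACTION = 0.5             # assumption: share of cars parked/idle at any moment
PER_CAR_TOPS = 300              # assumption: usable HW4 inference throughput, in TOPS
OWNER_PAYMENT_PER_HOUR = 0.10   # assumption: dollars paid to an owner per compute-hour
CLOUD_GPU_PER_HOUR = 2.50       # assumption: dollars per comparable cloud accelerator-hour

available_cars = FLEET_HW4_CARS * IDLE_FRACTION
aggregate_tops = available_cars * PER_CAR_TOPS          # total fleet throughput in TOPS

fleet_cost_per_hour = available_cars * OWNER_PAYMENT_PER_HOUR
cloud_cost_per_hour = available_cars * CLOUD_GPU_PER_HOUR  # renting the same node count

print(f"Cars available at any moment: {available_cars:,.0f}")
print(f"Aggregate inference compute:  {aggregate_tops / 1e6:,.1f} exa-OPS")
print(f"Hourly cost, fleet lease:     ${fleet_cost_per_hour:,.0f}")
print(f"Hourly cost, cloud rental:    ${cloud_cost_per_hour:,.0f}")
```

Under these placeholder numbers the fleet lease is an order of magnitude cheaper per hour than renting equivalent cloud nodes, which is the shape of the capital-efficiency claim; the real gap depends entirely on the actual lease rate and how much useful work HW4 can do per car.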
What is Macrohard
Macrohard is the digital equivalent of Optimus: it emulates any keyboard/mouse/screen-based human digital task at a fraction of the cost, with 24/7 uptime and no software adoption needed. Rollout will be slow at first, then scale rapidly; going from 1,000 to 1 million emulators is not infrastructure-limited.
How Elon Deals with Problems or Fires
Immediate action: phone calls to vendors for next-day patches and side-by-side fixes. Blockers are resolved in hours or days.