OpenAI Invests in Ultrasound-Based Brain-Computer Interfaces at Merge Labs

By Agustin Giovagnoli / January 15, 2026

Sam Altman has quietly co-founded Merge Labs, a research-driven neurotechnology startup developing an ultrasound-first, non-invasive brain–computer interface (BCI) tightly integrated with advanced AI systems. OpenAI is both investing and collaborating on research, signaling a strategic bet on ultrasound-based brain-computer interfaces that may enable thought-to-AI interactions for work and daily computing [1][2][3].

What Merge Labs Is Building: Ultrasound-Based Brain-Computer Interfaces

Merge champions “sensing over surgery,” focusing on read-only decoding of neural activity to route user intent or internal brain states into models like ChatGPT, with ambitions to later add write capabilities—all while minimizing or avoiding surgical implants [1][3][5]. Its platform pairs focused ultrasound with genetic and molecular tools, including gene-encoded acoustic reporters, to access deep brain tissue with broad coverage in a largely non-invasive way [1][3][5]. The near-term vision is a read-only brain-computer interface that could create a more natural input layer for AI rather than controlling cursors or prosthetics [1][3][5].
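Merge Labs has published no technical details of this pipeline, so as a purely illustrative aid, the "sense, decode, route intent to a model" idea described above can be sketched in a few lines. Every name here (NeuralFrame, decode_intent, route_to_model) is hypothetical, and the decoder is a stand-in threshold rather than any real neural model:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical stages of a read-only, sensing-first BCI pipeline:
# ultrasound sensing -> feature extraction -> intent decoding -> AI model.

@dataclass
class NeuralFrame:
    """One window of processed sensor readings (illustrative only)."""
    features: list[float]

def decode_intent(frame: NeuralFrame) -> str:
    """Placeholder decoder mapping neural features to a text intent.
    A real system would use a trained model; here we threshold one value."""
    return "open_notes" if frame.features and frame.features[0] > 0.5 else "idle"

def route_to_model(intent: str, ask_model: Callable[[str], str]) -> str:
    """Forward a non-idle decoded intent to an AI model as a text prompt."""
    if intent == "idle":
        return ""  # a read-only pipeline emits nothing without clear intent
    return ask_model(f"User intent: {intent}")

# Usage with a stubbed model call (no real API involved):
reply = route_to_model(decode_intent(NeuralFrame([0.9])),
                       ask_model=lambda prompt: f"ack: {prompt}")
```

The key design point the sketch captures is that a read-only interface acts as an input layer: nothing flows back into the brain, and ambiguous signals ("idle") produce no action at all.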

The science and leadership

The company’s scientific direction is anchored by Caltech biomolecular engineer Mikhail Shapiro, whose work underpins the use of focused ultrasound and engineered cells for targeted neuromodulation and sensing [1][3][5]. Peer-reviewed studies indicate that optimized ultrasound parameters can reliably excite or inhibit specific brain regions and alter behavior—with limited systemic side effects—supporting the feasibility of repeated deep-brain stimulation [7]. This evidence base strengthens the case that focused ultrasound neuromodulation can be tuned for precision and safety in non-invasive BCI applications [7].

Funding, Structure, and Strategic Backers

Merge Labs has raised around $252 million in seed funding. Backers include OpenAI, Bain Capital Ventures, and Valve cofounder Gabe Newell, reflecting a cross-section of AI, venture, and gaming leaders [1][2][3]. The company is incorporated as a Public Benefit Corporation (PBC), emphasizing long-term research rather than near-term product sales [1][3]. OpenAI’s dual role—as investor and research collaborator—positions it to explore AI foundation models informed by large-scale brain data [1][2][3].

How It Differs from Implant-Based BCIs

Unlike implant-first approaches such as Neuralink’s multi-site electrodes, Merge aims to avoid open-brain surgery with an ultrasound-first stack and molecular tools, prioritizing safety, repeatability, and broader coverage across deep brain regions [1][3][5]. The near-term emphasis on a read-only brain-computer interface could suit AI-driven workflows—translating intent to models—without committing to invasive implantation [1][3][5]. For businesses, this could mean faster experimentation and lower risk profiles compared to implant-dependent systems, if performance and signal fidelity meet practical thresholds [1][3][5].

Regulatory, Clinical, and Commercial Hurdles

Merge enters a U.S. environment where non-invasive neuromodulatory devices face evolving but demanding FDA pathways, implying rigorous clinical trials and multi-stage review before any therapeutic or assistive claims can be marketed [1][3][8]. As a research-first PBC, the company appears structured to endure extended validation cycles and to publish scientific results along the way [1][3]. For general background on device oversight, see the FDA's medical device regulatory framework.

Business and Product Implications for AI-First Companies

If Merge’s read-only approach can reliably decode intent at scale, it could streamline human–AI interaction in knowledge work, customer support triage, or high-cognitive-load environments where hands-free, latency-sensitive input matters [1][3][5]. OpenAI’s involvement suggests potential co-development of AI models tailored to neural inputs, expanding use cases beyond cursor control toward thought-to-AI queries and context-aware assistants [1][2][3]. For strategy teams, this is an early signal to monitor neurotech integrations as a potential interface shift alongside voice and multimodal AI, and to begin preparing internal teams and workflows for such interfaces.

Risks, Ethics, and Data Privacy Considerations

Brain data is uniquely sensitive. Even with a read-only orientation, companies will need clear consent frameworks, on-device or privacy-preserving processing where possible, and governance policies that anticipate misuse and secondary inferences from neural signals. Merge’s research posture—and likely extended regulatory path—creates space to develop best practices around data retention, model training, and user autonomy before any broad deployments occur [1][3][8].
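As a concrete (and entirely hypothetical) illustration of the governance ideas above, a consent-and-retention gate for neural data might look like the following. The field names, purposes, and 30-day retention window are assumptions for the sketch, not Merge Labs policy:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative consent-and-retention gate for neural data records.
# All names and policy values here are assumptions, not any vendor's practice.

@dataclass
class NeuralRecord:
    captured_at: datetime
    consent_scopes: frozenset[str]  # purposes the user explicitly approved

RETENTION = timedelta(days=30)  # example retention window

def may_process(record: NeuralRecord, purpose: str, now: datetime) -> bool:
    """Allow processing only for an explicitly consented purpose, and only
    while the record is inside the retention window. Secondary uses (e.g.
    model training) are denied unless separately consented."""
    within_retention = now - record.captured_at <= RETENTION
    return purpose in record.consent_scopes and within_retention

# Usage: a record consented only for intent decoding, captured 5 days ago.
now = datetime(2026, 1, 15, tzinfo=timezone.utc)
rec = NeuralRecord(captured_at=now - timedelta(days=5),
                   consent_scopes=frozenset({"intent_decoding"}))
```

The design choice worth noting is default-deny: any purpose not explicitly in the consent set (such as training on neural signals) is refused, which is one way to operationalize protection against the secondary inferences the paragraph above warns about.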

What to Watch Next

Key signals include peer-reviewed publications demonstrating robust decoding performance; evidence of safe, repeatable targeting in deeper brain regions; early clinical studies; and regulatory filings that clarify indications and pathways [1][3][7][8]. On the business side, look for updates on OpenAI’s research collaboration, additional strategic investors, and prototypes showing practical value in enterprise workflows [1][2][3]. As evidence matures, expect renewed interest in ultrasound-based brain-computer interfaces as a candidate interface layer for AI-first products [1][3][5][7].

Conclusion

Merge Labs’ ultrasound-first, non-invasive thesis, substantial seed funding, and partnership with OpenAI put it at the forefront of a potential interface shift: thought-to-AI input that could remake productivity and user experience. The road runs through science and regulation, but the strategic upside—natural, high-bandwidth human–machine collaboration—keeps ultrasound-based brain-computer interfaces firmly on the watchlist for leaders in AI and enterprise technology [1][2][3][5][7][8].

Sources

[1] OpenAI Invests in Sam Altman’s New Brain Tech Startup …
https://www.wired.com/story/openai-invests-in-sam-altmans-new-brain-tech-startup-merge-labs/

[2] Merge Labs Raises $252M from OpenAI, Bain, & Gabe …
https://www.sourcery.vc/p/breaking-merge-labs-raises-252m-from

[3] Exclusive: OpenAI and Sam Altman Back A Bold New Take …
https://www.corememory.com/p/exclusive-openai-and-sam-altman-back-merge-labs-bci

[4] Sam Altman launches Merge Labs, a non-invasive BCI startup
https://www.linkedin.com/posts/menastartupdigest_sam-altman-launches-merge-labs-to-challenge-activity-7388591364990652416-sH-4

[5] Altman’s Merge Labs pursues ultrasound brain interface
https://www.rdworldonline.com/altmans-rumored-brain-interface-startup-chases-thought-to-chatgpt-dreams/

[6] Sam Altman & OpenAI Explore Non-Invasive Mind-Reading
https://www.precedenceresearch.com/news/altman-brain-computer-interface

[7] Optimized ultrasound neuromodulation for non-invasive control of …
https://pmc.ncbi.nlm.nih.gov/articles/PMC11709124/

[8] United States Non-Invasive Neuromodulatory Devices Market Size …
https://www.linkedin.com/pulse/united-states-non-invasive-neuromodulatory-dnodes/
