Donate to the ARC Theory project

The Theory team is led by Paul Christiano, furthering the research initially laid out in our report on Eliciting Latent Knowledge. At a high level, we’re trying to figure out how to train ML systems to answer questions by straightforwardly “translating” their beliefs into natural language, rather than by reasoning about what a human wants to hear. We expect to use funds donated via this fundraiser to support the Theory project.

However, all donations via every.org are unrestricted. If you are planning to give more than $10,000 and would like to place a restriction on your donation, reach out to donations@alignment.org.

A 501(c)(3) nonprofit, EIN 86-3605182

ARC is a non-profit research organization whose mission is to align future machine learning systems with human interests.

alignment.org
