In October, President Joe Biden issued an executive order aimed at regulating and expanding artificial intelligence (AI) in the United States. It’s a tricky area as this technology continues to spread across the globe and take over common human tasks. Lawmakers are concerned about how AI innovations are being monitored while trying to ensure America stays ahead in this new race for dominance. While it’s important to seek recommendations from a variety of experts in the field, there’s heavy criticism that the administration is relying predominantly on one think tank with former ties to the White House.
RAND’s Role in Biden’s AI Order
The RAND Corporation is being accused of having too much influence over the crafting of the president’s executive order, with some of its recommendations mirrored in the EO. An AI researcher told Politico that by “serving as the initial recommender for key provisions in the AI executive order – rather than simply helping the Biden administration draft and implement its own priorities,” RAND had gone beyond being a “technical assistance operation” to an “influence operation.”
Part of the controversy lies in how the think tank set out reporting requirements on AI systems that reportedly nearly duplicate the policy priorities of Open Philanthropy (OP), an organization that has already given RAND $15 million this year. OP, in turn, is financed by billionaire Facebook co-founder Dustin Moskovitz and his wife, Cari Tuna. Another concern is that the organization funds causes associated with “effective altruism,” an ideology made famous by FTX founder Sam Bankman-Fried.
Why is this an issue? Effective altruism emphasizes a data-driven approach to philanthropy, focusing on artificial intelligence’s apocalyptic potential, including the concern that AI technology could be used to develop bioweapons. Critics fear this focus serves only the interests of tech companies while distracting lawmakers from current AI threats, such as the promotion of racial bias. Many top personnel at tech companies already subscribe to this ideology. Because RAND heavily influences White House thinking on AI and effective altruist ideas, critics worry US policies are being shaped by powerful billionaire-backed tech interests that might not have Americans’ well-being and safety as their main goal.
The RAND, White House, and Open Philanthropy Connection
Besides the $15 million OP gave to RAND, other connections are cause for concern, according to those who oppose RAND’s heavy involvement in Biden’s executive order. RAND CEO Jason Matheny and Senior Information Scientist Jeff Alstott are not only well-known effective altruists but also worked in the Biden White House Office of Science and Technology Policy and on the National Security Council before joining the think tank last year.
OP has ties to AI firms Anthropic and OpenAI, and Matheny is one of five members serving on Anthropic’s Long-Term Benefit Trust. OP also finances the Horizon Institute for Public Service, an organization that places staffers across Washington to work on assessing risks and other policies related to AI and biotechnologies. Two AI fellows funded by this organization work at RAND. “Those fellows are part of a broader network, financed by Open Philanthropy and other tech-linked groups, that is funding AI staffers in Congress, at federal agencies and across key think tanks in Washington,” Politico explained.
During an oversight committee hearing on cybersecurity, information technology, and government innovation earlier this month, several witnesses testified on the need to step up efforts to advance AI technology. They pointed to positive developments such as new cybersecurity directives for the State Department, Homeland Security, and other federal departments, as well as the initiative to bring in more immigrants with advanced science and technology skills. However, some believe the government is exerting too much control and that private entities should be spared red tape so the United States does not continue to fall behind China and other global competitors.