{"id":234338,"date":"2026-05-05T23:05:27","date_gmt":"2026-05-05T22:05:27","guid":{"rendered":"https:\/\/neuscorp.com\/index.php\/2026\/05\/05\/234338\/"},"modified":"2026-05-05T23:05:27","modified_gmt":"2026-05-05T22:05:27","slug":"234338","status":"publish","type":"post","link":"https:\/\/neuscorp.com\/index.php\/2026\/05\/05\/234338\/","title":{"rendered":""},"content":{"rendered":"<p><a href=\"https:\/\/www.theguardian.com\/technology\/2026\/may\/05\/commerce-department-ai-agreements-google-microsoft-xai\">Source link <\/a><\/p>\n<div>\n<p class=\"dcr-130mj7b\">The US government has struck deals with Google DeepMind, Microsoft and xAI to review early versions of their new AI models before they are released to the public.<\/p>\n<p class=\"dcr-130mj7b\">The Center for AI Standards and Innovation (CAISI), part of the US Department of Commerce, <a href=\"https:\/\/www.nist.gov\/news-events\/news\/2026\/05\/caisi-signs-agreements-regarding-frontier-ai-national-security-testing\" data-link-name=\"in body link\">announced<\/a> the agreements on Tuesday, saying the review process would be key to understanding the capabilities of new and powerful AI models as well as to protecting US national security. 
These collaborations will help the federal government \u201cscale (its) work in the public interest at a critical moment\u201d, the agency said in a press release.<\/p>\n<p class=\"dcr-130mj7b\">\u201cIndependent, rigorous measurement science is essential to understanding frontier AI and its national security implications,\u201d said Chris Fall, CAISI director.<\/p>\n<p class=\"dcr-130mj7b\"><a href=\"https:\/\/www.nist.gov\/caisi\" data-link-name=\"in body link\">CAISI<\/a> is an agency meant to facilitate collaboration between the tech industry and the federal government in developing standards and assessing risks for commercial AI systems. 
The agreements between the agency and the AI firms are focused largely on identifying national security risks tied to cybersecurity, biosecurity and chemical weapons.<\/p>\n<p class=\"dcr-130mj7b\">OpenAI and Anthropic <a href=\"https:\/\/www.nist.gov\/news-events\/news\/2024\/08\/us-ai-safety-institute-signs-agreements-regarding-ai-safety-research\" data-link-name=\"in body link\">inked<\/a> similar deals with the Biden administration two years ago, and CAISI notes that it has already completed more than 40 such evaluations, including on unreleased models. It is common for developers to share with the government unreleased AI models that have reduced or removed safety guardrails, CAISI said in its press release. This helps the government \u201cthoroughly evaluate national security-related capabilities and risks\u201d, the agency noted.<\/p>\n<p class=\"dcr-130mj7b\">The new agreements come as fears grow that the newest and most powerful AI models \u2013 such as Anthropic\u2019s Mythos \u2013 could be dangerous to release to the public; AI safety experts, government officials and tech companies fear the expansive capabilities of these models could help hackers exploit cybersecurity vulnerabilities at an unprecedented scale. 
Anthropic limited its rollout of Mythos to a few companies, and initiated the collaborative <a href=\"https:\/\/www.anthropic.com\/glasswing\" data-link-name=\"in body link\">Project Glasswing<\/a> to bring together tech companies \u201cto secure the world\u2019s most critical software\u201d.<\/p>\n<p class=\"dcr-130mj7b\"><a href=\"https:\/\/www.nytimes.com\/2026\/05\/04\/technology\/trump-ai-models.html\" data-link-name=\"in body link\">The New York Times<\/a> and <a href=\"https:\/\/www.google.com\/search?q=wsj+white+house+executive+order+ai&amp;sca_esv=b7b28de9b2affd5c&amp;biw=730&amp;bih=709&amp;sxsrf=ANbL-n5cyL0Bxy9c51t81jUjPOD8UlOnpQ%3A1777996260425&amp;ei=5BH6adXSGaOV5OMPzuO82Ak&amp;ved=0ahUKEwjV1I6cwKKUAxWjCnkGHc4xD5sQ4dUDCBM&amp;uact=5&amp;oq=wsj+white+house+executive+order+ai&amp;gs_lp=Egxnd3Mtd2l6LXNlcnAiIndzaiB3aGl0ZSBob3VzZSBleGVjdXRpdmUgb3JkZXIgYWkyBRAhGKABMgUQIRigATIFECEYqwIyBRAhGKsCSJkbUABYrRpwAHgAkAEAmAF0oAGsE6oBBDI4LjK4AQPIAQD4AQGYAh6gAvETwgIREAAYgAQYigUYkQIYsQMYgwHCAhAQABiABBiKBRhDGLEDGIMBwgINEAAYgAQYigUYQxixA8ICChAAGIAEGIoFGEPCAg4QABiABBiKBRixAxiDAcICCxAAGIAEGIoFGJECwgIFEAAYgATCAgYQABgWGB7CAggQABgWGB4YCsICCxAAGIAEGIoFGIYDwgIIEAAYgAQYogTCAgUQABjvBcICBRAhGJ8FmAMAkgcEMjcuM6AHkX2yBwQyNy4zuAfxE8IHBDcuMjPIByqACAE&amp;sclient=gws-wiz-serp\" data-link-name=\"in body link\">Wall Street Journal<\/a> reported Monday the Trump administration was mulling over a potential executive order to create a government oversight process for these AI tools; the administration has characterized this reporting as \u201cspeculation\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Google and xAI did not immediately respond to a request for comment.<\/p>\n<p class=\"dcr-130mj7b\">Microsoft announced a similar agreement in the UK on Tuesday with the government-backed AI Security Institute, which also focuses on safe AI development.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWhile Microsoft regularly undertakes many types of AI testing on its own, testing for national security and large-scale public safety risks necessarily must be a collaborative endeavor with governments,\u201d Microsoft wrote in a <a href=\"https:\/\/blogs.microsoft.com\/on-the-issues\/2026\/05\/05\/advancing-ai-evaluation-with-the-center-for-ai-standards-us-and-innovation-and-the-ai-security-institute-uk\/\" data-link-name=\"in body link\">blog post<\/a> about the two deals.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Source link The US government has struck deals with Google DeepMind, Microsoft and xAI to review early versions of their new AI models before they are released to the public. 
The Center for AI Standards and Innovation (CAISI), part of the US Department of Commerce, announced the agreements on Tuesday, saying the review process would &hellip;<\/p>\n","protected":false},"author":2,"featured_media":234339,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[37],"tags":[],"class_list":["post-234338","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-economy"],"_links":{"self":[{"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/posts\/234338"}],"collection":[{"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/comments?post=234338"}],"version-history":[{"count":0,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/posts\/234338\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/media\/234339"}],"wp:attachment":[{"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/media?parent=234338"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/categories?post=234338"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neuscorp.com\/index.php\/wp-json\/wp\/v2\/tags?post=234338"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}