{"id":19082,"date":"2023-06-06T07:14:27","date_gmt":"2023-06-06T07:14:27","guid":{"rendered":"https:\/\/nftandcrypto-news.com\/crypto\/ai-should-be-regulated-like-medicine-and-nuclear-power-uk-minister\/"},"modified":"2023-06-06T07:14:29","modified_gmt":"2023-06-06T07:14:29","slug":"ai-should-be-regulated-like-medicine-and-nuclear-power-uk-minister","status":"publish","type":"post","link":"https:\/\/nftandcrypto-news.com\/crypto\/ai-should-be-regulated-like-medicine-and-nuclear-power-uk-minister\/","title":{"rendered":"AI should be regulated like medicine and nuclear power: UK minister"},"content":{"rendered":"

Developers working on artificial intelligence should be licensed and regulated similarly to the pharmaceutical, medical, or nuclear industries, according to a representative of Britain\u2019s opposition party.<\/p>\n

Lucy Powell, a politician and digital spokesperson for the United Kingdom\u2019s Labour Party, told The Guardian on June 5 that firms such as OpenAI and Google, which have created AI models, should \u201chave to have a license in order to build these models,\u201d adding:<\/p>\n

\u201cMy real point of concern is the lack of any regulation of the large language models that can then be applied across a range of AI tools, whether that\u2019s governing how they are built, how they are managed or how they are controlled.\u201d<\/p><\/blockquote>\n

Powell argued that regulating the development of certain technologies is a better option than banning them, similar to how the European Union banned facial recognition tools.<\/p>\n

She added that AI \u201ccan have a lot of unintended consequences,\u201d but if developers were forced to be open about their AI training models and datasets, then the government could mitigate some of the risks.<\/p>\n

\u201cThis technology is moving so fast that it needs an active, interventionist government approach, rather than a laissez-faire one,\u201d she said. <\/p>\n

\n

Ahead of speaking at the TechUk conference tomorrow, I spoke to the Guardian about Labour\u2019s approach to digital tech and AI https:\/\/t.co\/qzypKE5uJU<\/p>\n

\u2014 Lucy Powell MP (@LucyMPowell) June 5, 2023<\/a><\/p><\/blockquote>\n

Powell also believes such advanced technology could greatly impact the U.K. economy, and the Labour Party is reportedly finalizing its own policies on AI and related technologies. <\/p>\n

Next week, Labour leader Keir Starmer plans to hold a meeting with the party\u2019s shadow cabinet at Google\u2019s U.K. offices to speak with the company\u2019s AI-focused executives.<\/p>\n

Related: <\/em><\/strong>EU officials want all AI-generated content to be labeled<\/em><\/strong><\/p>\n

Meanwhile, on June 5, Matt Clifford, the chair of the Advanced Research and Invention Agency \u2014 the government\u2019s research agency set up last February \u2014 appeared on TalkTV to warn that AI could threaten humans in as little as two years.<\/p>\n

\n

EXCLUSIVE: The PM\u2019s AI Task Force adviser Matt Clifford says the world may only have two years left to tame Artificial Intelligence before computers become too powerful for humans to control.<\/p>\n

\u2014 TalkTV (@TalkTV) June 5, 2023<\/a><\/p><\/blockquote>\n

\u201cIf we don\u2019t start to think about now how to regulate and think about safety, then in two years\u2019 time we\u2019ll be finding that we have systems that are very powerful indeed,\u201d he said. Clifford clarified, however, that a two-year timeline is the \u201cbullish end of the spectrum.\u201d<\/p>\n

Clifford highlighted that AI tools today could be used to help \u201claunch large-scale cyber attacks.\u201d OpenAI has put forward $1 million to support AI-aided cybersecurity tech to thwart such uses.<\/p>\n

\u201cI think there\u2019s [sic] lots of different scenarios to worry about,\u201d he said. \u201cI certainly think it\u2019s right that it should be very high on the policymakers\u2019 agendas.\u201d<\/p>\n

BitCulture: <\/em><\/strong>Fine art on Solana, AI music, podcast + book reviews<\/em><\/strong><\/p>\n<\/div>\n