    The drawbacks of analyzing technology policy based on first impressions

    Too often, technology policy analysis relies on individual opinions instead of collective critical thinking. The policy field should strive for better standards.

    By Myles Ulwelling | March 10, 2024 | News | 9 Mins Read
    FILE - A ChatGPT logo is seen on a smartphone in West Chester, Pa., Wednesday, Dec. 6, 2023. ChatGPT catalyzed a year of AI fanfare. The World Economic Forum says false and misleading information supercharged with cutting-edge artificial intelligence is the top immediate risk to the global economy. In its latest Global Risks Report, the organization ranks misinformation and disinformation as the most severe risk over the next two years. (AP Photo/Matt Rourke, File)

    At the end of 2022, OpenAI introduced a Transformer-based large language model (LLM) called "ChatGPT." Contrary to what OpenAI's own staff expected, ChatGPT became the fastest-growing web-based app in history, reaching 100 million active users in two months (a record since surpassed only by Meta's Threads). Initial public reactions to ChatGPT ranged from the enthusiastic to the alarmed. In February 2023, Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher declared that generative artificial intelligence (AI) is comparable to the intellectual revolution brought about by the printing press, this time consolidating and "distilling" the storehouse of human knowledge. In March 2023, Eliezer Yudkowsky, foreseeing extinction-level risks, called for the world's governments and militaries to shut down the AI project and "be willing to destroy a rogue datacenter by airstrike." These opinions represent two extremes, but the reasoning that falls between them is common in technology policy analysis: personal impressions of generative AI become part of the basic assumptions used in policy analyses. When assumptions of fundamental importance go unquestioned, it is very easy to fall into the trap of predicting future technological advancements based on current technological conditions. Technology policy analysts of all stripes do great work, but it is time to identify the gaps in our reasoning and aim for higher standards, both individually and collectively.

    An example illustrates the general tendency. The Center for a New American Security's Paul Scharre, in his book "Four Battlegrounds"—which in general provides a wealth of insights—speculates about the future of AI, leaning toward the idea that "Building larger, more diverse datasets may result in more robust models. Multimodal datasets may help to build models that can associate concepts represented in multiple formats, such as text, images, video, and audio." This expectation rests on the idea that scaling up AI systems (making their internal capacity and training datasets larger) will lead to new capabilities, with a positive reference to Richard Sutton's well-known argument in "The Bitter Lesson" about the benefits of such techniques. Not long after, Microsoft's researchers contributed to a wave of overly optimistic claims about the future of LLMs with their provocatively titled "Sparks of Artificial General Intelligence" paper on GPT-4. It is not hard to see how one's personal impression of GPT-4 could lead to a similar sense of "We're on the brink of something big here." But that sense is not a valid reason to let the assumptions tied up in it go unchecked in one's analyses.

    A great deal of research emphasizes the limitations of LLMs and other Transformer-based systems. Hallucinations (authoritative-sounding but factually incorrect statements) continue to trouble LLMs, with some researchers suggesting that they are simply inherent features of the technology. According to a recent study, voters using chatbots for basic information about the 2024 elections can easily be misinformed by hallucinated polling places and other false or outdated information. Other research shows that LLMs lag behind humans in forming summaries and in generalizing, and the reasoning abilities of multimodal systems fare no better. OpenAI's most recent creation, "Sora," can generate realistic-looking videos from text, yet it invents objects and people that do not obey real-world physics.
    This cuts against the idea that new modalities such as image and video will deliver the reliable, robust, and explainable AI systems we want.

    None of this is to say that there is only hype in the technology world. Carnegie's Matt O'Shaughnessy correctly notes that talk of "superintelligence" is likely to negatively influence policymaking because of machine learning's fundamental limitations. And the Biden administration's extensive October 2023 executive order on AI, while dramatically invoking the Defense Production Act to authorize the monitoring of certain computationally powerful AI systems, was more diverse in tone than one might expect.

    Yet the problem we identify here is not a hype problem per se. Hype is a result of getting stuck in analytic frames that are too easily ignored in favor of quick publications and individual or organizational self-promotion. Lest we mistakenly believe this is just a peculiar LLM-specific tendency, the disappointing performance of AI-enabled and autonomous drones on the battlefield in Ukraine should raise eyebrows about the alleged rapidity of fundamental breakthroughs occurring in 2023. Moreover, nuance is easier to find in the domain of quantum information science, yet little individual or collective reflection appears to arise as its crown jewel, quantum computing, begins to see its future downgraded.

    Nevertheless, generative AI today is starting to look like a parody of Mao’s Continuous Revolution—the transformation of this technology into a human-like “general” intelligence or some other marvel of technological imagination is always one model upgrade away, and it cannot be allowed to succumb to challenges from regulatory bodies or popular movements.

    The takeaway is that policy analysts make choices when assessing technology. The choice of certain assumptions over others presents the analyst with a certain set of possible policy options at the expense of others. That individuals have first impressions of new technologies is unavoidable and can serve as a source of diversity of opinion. The problem for policy analysis arises when practitioners fail to pour their first (or second, or third) impressions into a shared crucible that exposes unstable ideas to high-temperature intellectual criticism, thereby guiding them toward an articulation of specific policy challenges and solutions without neglecting other possibilities wholesale.

    Policy analysis generally is a concoction of ingredients from industry, domestic politics, and international affairs. Merely identifying that a policy challenge exists is not done de novo; it proceeds from an intuitive link between the needs and values of a society and the anticipated or actual impacts of developments within its borders or abroad. That intuition—we all have it—should be the focus of our honest and shared scrutiny.

    Vincent J. Carchidi is a Non-Resident Scholar at the Middle East Institute’s Strategic Technologies and Cyber Security Program. He is also a member of Foreign Policy for America’s NextGen Initiative 2024 Cohort.
