The New Coffee Room


Interesting AI Thought Exercise

taiwan_girl wrote (#1):

    https://github.com/haykgrigo3/TimeCapsuleLLM

    An interesting thing about contemporary artificial intelligence models, specifically large language models (LLMs): they can only output text based on what’s in their training dataset. Models such as ChatGPT and Claude are “trained” on large collections of text. When asked a question, a model builds its response statistically, predicting one word at a time based on which word is most likely to come next. A consequence of this is that LLMs can’t output text about scientific breakthroughs that have yet to happen, because there’s no existing literature about those breakthroughs. The best an AI could do is repeat predictions already written by researchers, or synthesize those predictions.
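
    As a minimal sketch of what “predicting the next word” means (a toy example of my own, not code from ChatGPT, Claude, or TimeCapsuleLLM), a bigram model just counts which word follows which in its training text and then samples one word at a time. By construction it can never emit a word that isn’t already in that text:

    # Toy next-word predictor: a bigram model over a tiny made-up "period" corpus.
    # Illustrative only; real LLMs use neural networks over tokens, but the core
    # limitation is the same: output is drawn from the training data.
    import random
    from collections import defaultdict, Counter

    corpus = (
        "the moon is a lamp hung in the heavens "
        "the physician prescribes leeches for a headache "
        "the steam engine is the marvel of the age"
    ).split()

    # Count how often each word follows each other word.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def generate(start, length=8, seed=0):
        """Sample each next word in proportion to how often it followed
        the previous word in the training text."""
        rng = random.Random(seed)
        words = [start]
        for _ in range(length):
            counts = following.get(words[-1])
            if not counts:
                break  # no known continuation for this word
            choices, weights = zip(*counts.items())
            words.append(rng.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate("the"))
    # Whatever comes out can only contain words from the corpus above; "rocket"
    # or "Apollo" cannot appear, because the model has never seen them.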

    Adam Mastroianni, writing in his newsletter Experimental History, put this elegantly: “If you booted up a super-smart AI in ancient Greece, fed it all human knowledge, and asked it how to land on the moon, it would respond, ‘You can’t land on the moon. The moon is a god floating in the sky.'”

    It’s an interesting thought experiment. What if you intentionally limited the training data? Could you create an AI system that responds as though it’s from a period in the past?  What could that reveal about the psychology or everyday experiences of the people from that era?

    That’s exactly what Hayk Grigorian, a student at Muhlenberg College in Allentown, Pennsylvania, had in mind when he created TimeCapsuleLLM. This experimental AI system was trained entirely on texts from 19th-century London. The current release is based on 90 gigabytes of text from works originally published in the city of London between 1800 and 1875.
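
    The data selection is what makes it a “time capsule”: everything the model sees has to fall inside that window. As a rough sketch of the idea (my own illustration with hypothetical file names and metadata fields, not the project’s actual pipeline), you would filter a document collection by publication year before training:

    # Illustrative only: build a date-bounded training file from a document
    # collection. The JSONL layout, field names, and paths are hypothetical.
    import json
    from pathlib import Path

    START_YEAR, END_YEAR = 1800, 1875  # the window described above, inclusive

    def select_period_texts(metadata_path: Path, out_path: Path) -> int:
        """Write the text of every document whose publication year falls
        inside the window into one training file; return how many were kept."""
        kept = 0
        with out_path.open("w", encoding="utf-8") as out:
            for line in metadata_path.open(encoding="utf-8"):
                record = json.loads(line)  # e.g. {"year": 1834, "text": "..."}
                if START_YEAR <= record["year"] <= END_YEAR:
                    out.write(record["text"] + "\n")
                    kept += 1
        return kept

    # Hypothetical usage:
    # kept = select_period_texts(Path("london_docs.jsonl"), Path("train_1800_1875.txt"))
    # print(f"kept {kept} documents")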

Mik wrote (#2):

      That is an interesting thought exercise.

      "Chatgpt, I have a headache"

      "Your options are bloodletting or leeches"

      "You cannot subsidize irresponsibility and expect people to become more responsible." — Thomas Sowell

jon-nyc wrote (#3):

        That’s cool. I’ve seen mention of that project. Would be fun to try.

        The whole reason we call them illegal aliens is because they’re subject to our laws.

taiwan_girl wrote (#4), replying to Mik:

          @Mik 555

89th wrote (#5):

            Fascinating idea, thanks for sharing!
