Source.

Transcript

Screenshot of a Mastodon post by Kevin Beaumont: “Generative AI government lobbying.”

Photo of AI/tech company CEOs, captioned:
We spent a Trillion on NVDA GPUs and we don’t have any AI product you want.

Photo of a crying man, captioned:
Please like our AI bro. This is the last time bro. So many possibilities bro. It’s the future bro. Just need you to like it bro. We worked real hard bro. Our stockholders need this one bro.

  • sturger@sh.itjust.works
    14 hours ago
    Technology         Example  Maybe Useful For   Marketed As                     Actual Use
    -----------------  -------  -----------------  ------------------------------  -----------------------------------
    Voice Recognition  Alexa    Voice typing       OMG!!! Digital Assistant!!!!!   Hoover up your personal information
    LLM                ChatGPT  Better Google      OMG!!! Replace people!!!!       Hoover up your personal information
    Blockchain         -        ?                  OMG!!!! Put in everything!!!!!  Cryptocurrency
    NFT                -        N/A                OMG!!!! NFT!!!!                 Fleecing rubes
    Cryptocurrency     BitCoin  Online $ transfer  OMG!!!! Buy! Buy! Buy!          Stealing your real money
    

    N.B. There has GOT to be a better way to align tables in Lemmy.

    • Honytawk@lemmy.zip
      13 hours ago (edited)

      LLMs are definitely not a better Google. They are text generators, not search engines.

      They could be used in conjunction with search engines, as Perplexity does. But they cannot and will never replace them.
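
      For what it’s worth, the “LLM plus search engine” combination described above is usually called retrieval-augmented generation: a conventional search engine fetches documents, and the model only phrases an answer grounded in those results. Below is a minimal sketch in Python; search() and generate() are hypothetical stand-ins for a real search API and a real LLM call, not any particular library.

        # Sketch of retrieval-augmented generation (RAG): the search engine
        # supplies the facts, the LLM only phrases the answer.
        # search() and generate() are hypothetical placeholders.

        def search(query: str, k: int = 3) -> list[str]:
            # Stand-in for a real web-search API call.
            return [f"result {i} for {query!r}" for i in range(k)]

        def generate(prompt: str) -> str:
            # Stand-in for a real LLM completion call.
            return f"(model answer conditioned on: {prompt[:60]}...)"

        def answer(question: str) -> str:
            docs = search(question)
            # Number the retrieved snippets so the model can cite them,
            # instead of answering from whatever it memorized in training.
            context = "\n".join(f"[{i}] {d}" for i, d in enumerate(docs))
            prompt = (
                "Answer using ONLY the numbered sources below.\n"
                f"Sources:\n{context}\n\nQuestion: {question}\nAnswer:"
            )
            return generate(prompt)

        print(answer("Who maintains the Linux kernel?"))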

      They make tons of mistakes and cannot be fully trusted, so don’t treat their output as truth or use them for anything that has to be faultless.

      LLMs are useful for descriptions (like comments in code), creative input (helping writers get past writer’s block), summaries (where the full text can be checked for hallucinations), and corporate emails (since nobody reads that flowery language anyway).