• 0 Posts
  • 84 Comments
Joined 1 year ago
Cake day: July 10th, 2024

  • But yes, exactly: in the use of the term “Artificial Intelligence”.

    Artificial Intelligence is a wide field, consisting of a plethora of methods. LLMs like ChatGPT are part of this wide field, according to how researchers define it.

    The “intelligence” part is an issue, though, if taken literally, since we have no clear definition of what “intelligence” even is — neither for human / natural intelligence, nor for artificial. But that’s how the field was labeled. We have created a category for a bunch of methods, models and algorithms and stuck “AI” onto it. Therefore I stand by what I said before:

    It is AI.

    Due to the lack of a clear definition of “intelligence”, I would coarsely outline AI as: mimicking natural thinking, problem solving and decision processes without necessarily being identical to them. (This makes it difficult to distinguish AI from plain calculators, though, so a better definition is required.) So if we have a model that can distinguish cat pictures from non-cat pictures, that’s AI. And if we have “autocorrect on steroids” (credit to Dirk Hohndel) like ChatGPT that matches the text comprehension skills of 15-year-olds (just an example), then this too is AI.

  • For dipping your toes into a new topic, I think it’s perfectly fine. It helps to provide useful pointers for further “research” (in a sense that would meet your requirement) and also manages to provide mostly accurate overviews. But sure, to really dive into a topic, LLMs like ChatGPT and co. are low-level assistants at best, and one should go through the material oneself.