OpenAI API - Embeddings
Author: How Dev You
Uploaded: 2024-07-03
Views: 118
Description:
Embeddings and their various applications (search, recommendation, clustering, etc.)
GitHub repo: https://github.com/howdevyou/openai-apis
Learn how these multi-dimensional vector representations of words drive crucial applications like recommendation systems, search functions, clustering, anomaly detection, and classification.
Timestamps:
00:00 Introduction to embeddings and their applications
03:12 Tokenization: Separating words and converting to vectors
04:26 Understanding vectors: Representation in 2D and 3D
06:04 Word embeddings and their creation process
07:46 Scoring system for relationship determination in embeddings
09:20 Example: Converting "hello" into an embedding list
11:21 Creating embeddings using the OpenAI API
12:24 Response from API: Embeddings for each word
13:23 Explaining cosine similarity and its significance
14:19 Comparing vectors: "king" and "cloud" similarity
15:54 Using embeddings to solve clustering problems
16:56 Clustering vehicles using k-means method
We'll walk you through how LinkedIn Learning's AI assistant uses embeddings to understand user queries and recommend relevant topics. Discover the magic behind embeddings by visualizing a dictionary of 10,000 words, each mapped to an embedding vector that reveals their relationships through proximity.
Dive into the inner workings of ChatGPT, where tokens are converted into embedding vectors to grasp context and meaning. We break down embedding vectors to show how each value reflects a word's connection to various attributes, using a hypothetical embedding matrix as an example.
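The hypothetical embedding matrix described above can be sketched as a simple lookup table. The attribute names and scores below are invented purely for illustration; real embedding dimensions number in the hundreds or thousands and are not human-labelled:

```python
# Hypothetical embedding matrix: each position in a word's vector scores it
# against one attribute. All names and values here are made up for
# illustration -- real embedding dimensions carry no readable labels.
attributes = ["royalty", "gender", "food", "motion"]

embedding_matrix = {
    "king":  [0.9,  0.8, 0.0, 0.1],
    "queen": [0.9, -0.8, 0.0, 0.1],
    "apple": [0.0,  0.0, 0.9, 0.0],
    "bus":   [0.0,  0.0, 0.0, 0.9],
}

def describe(word):
    """Pair each value in a word's vector with its (hypothetical) attribute."""
    return {attr: score for attr, score in zip(attributes, embedding_matrix[word])}

print(describe("king"))   # high "royalty", positive "gender" score
print(describe("apple"))  # high "food", near zero elsewhere
```

Words with related meanings ("king" and "queen") end up with similar vectors, which is what makes proximity in embedding space meaningful.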
Follow along as we demonstrate converting the word "hello" into an embedding list using the OpenAI API. Get step-by-step instructions on authentication, creating a client instance, and generating an embedding vector.
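The "hello" call can be sketched with the official `openai` Python package (v1.x). The model name `text-embedding-3-small` is an assumption rather than necessarily the one used in the video, and the request only runs if `OPENAI_API_KEY` is set in the environment:

```python
# Sketch of requesting an embedding from the OpenAI API (assumes
# `pip install openai` and an OPENAI_API_KEY environment variable;
# the model name is an assumption, pick any current embedding model).
import os

def get_embedding(text: str, model: str = "text-embedding-3-small") -> list[float]:
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.embeddings.create(model=model, input=text)
    return response.data[0].embedding

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    vector = get_embedding("hello")
    print(len(vector))  # the vector length depends on the chosen model
```

The response contains one embedding per input, so passing a list of words returns a matching list of vectors.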
Learn about cosine similarity and see how to calculate the similarity between different words, like "king" and "queen," to better understand their relationships.
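Cosine similarity is simple enough to compute by hand. The three-dimensional vectors below are toy values chosen to illustrate the idea, not real embeddings:

```python
# Cosine similarity: dot product of two vectors divided by the product of
# their lengths. Values near 1 mean the vectors point the same way.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (values invented for illustration).
king  = [0.8, 0.6, 0.1]
queen = [0.7, 0.7, 0.1]
cloud = [0.1, 0.2, 0.9]

print(cosine_similarity(king, queen))  # high: related words
print(cosine_similarity(king, cloud))  # low: unrelated words
```

With real embeddings the same comparison works, just over vectors with hundreds of dimensions.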
Finally, watch how embedding vectors for a list of words can be applied in clustering problems using the K-means algorithm. We’ll group words like "king," "queen," "apple," "orange," "bus," "bicycle," and "motorcycle" into meaningful clusters such as fruits, vehicles, and people.
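A minimal pure-Python k-means sketch over toy two-dimensional "embeddings" shows the idea; the coordinates are invented so the three groups separate cleanly, whereas real clustering would run on API-generated vectors (and typically a library such as scikit-learn):

```python
def dist2(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    # Deterministic farthest-point initialization: start from the first
    # point, then repeatedly add the point farthest from all centroids.
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points, key=lambda p: min(dist2(p, c) for c in centroids)))
    assignments = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        assignments = [min(range(k), key=lambda c: dist2(p, centroids[c]))
                       for p in points]
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, a in zip(points, assignments) if a == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assignments

# Toy 2-D "embeddings" (invented): fruits, vehicles, and royals each form
# a tight group, mimicking how real embeddings place related words nearby.
words = {
    "apple": [0.1, 0.2], "orange": [0.3, 0.1],
    "bus": [5.1, 5.0], "bicycle": [4.8, 5.2], "motorcycle": [5.0, 4.9],
    "king": [0.2, 5.1], "queen": [0.1, 4.9],
}
labels = kmeans(list(words.values()), k=3)
for word, label in zip(words, labels):
    print(word, "-> cluster", label)
```

Each word lands in the cluster of its group: fruits together, vehicles together, royals together.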
Whether you're a beginner or an AI enthusiast, this video is packed with insights that will elevate your understanding of embeddings and their practical use cases.