Code your AI to analyze a Tech report w/ multiple pre-trained transformer models (SBERT 2)
Author: Discover AI
Uploaded: 2021-03-25
Views: 234
Description:
Apply different pretrained transformer models to compute sentence embeddings for a given document. Empirical results from hands-on experiments coding your own AI system on a home PC with JupyterLab/Python.
We use BERT (https://ai.googleblog.com/2018/11/ope...) and RoBERTa ( / roberta-an-optimized-method-for-pretrainin... ) in our example.
Understand the different perspectives an AI will discover, given differently trained transformer models from sentence-transformers (https://www.sbert.net/).
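A minimal sketch of how such a comparison can look in code. In a real run, the vectors would come from sentence-transformers (e.g. `SentenceTransformer("bert-base-nli-mean-tokens")` and `"roberta-base-nli-mean-tokens"`, model names from around the v1.0 release; check sbert.net for current ones); here, toy stand-in vectors illustrate the idea that each model yields its own similarity structure over the same sentences:

```python
# Sketch: comparing sentence embeddings from two pretrained models.
# In practice the vectors come from sentence-transformers, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("bert-base-nli-mean-tokens")  # or "roberta-base-nli-mean-tokens"
#   embeddings = model.encode(sentences)
# The toy vectors below are hypothetical stand-ins for illustration only.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stand-in embeddings for the same two sentences under two models:
bert_vecs = {"s1": [0.9, 0.1, 0.0], "s2": [0.8, 0.2, 0.1]}
roberta_vecs = {"s1": [0.1, 0.9, 0.0], "s2": [0.0, 0.8, 0.3]}

# Each model induces its own similarity structure over the document's
# sentences; comparing these values is what reveals the models'
# "different perspectives" on the same text.
sim_bert = cosine_similarity(bert_vecs["s1"], bert_vecs["s2"])
sim_roberta = cosine_similarity(roberta_vecs["s1"], roberta_vecs["s2"])
print(f"BERT similarity:    {sim_bert:.3f}")
print(f"RoBERTa similarity: {sim_roberta:.3f}")
```

Clustering (or a UMAP projection, as the hashtags suggest) of each model's embedding matrix then makes the divergence between the two "perspectives" visible.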
#pythonai
#sbert
#nlproc
#nlptechniques
#clustering
#semantic
#bert
#climatechange
#3danimation
#3dvisualization
#topologicalspace
#deeplearning
#machinelearningwithpython
#pytorch
#sentence
#embedding
#complex
#ipcc
#umap
#insight
#pooling_layer_architecture
#sentence_transformer
#sentencetransformer_networks
#algebraic_topology
#SentenceTransformer
#sentence_embedding
#networkx
By the way: sentence-transformers is now available in v1.0 (https://github.com/UKPLab/sentence-tr...).