Star count exceeds 100,000! After Auto-GPT, Transformers reaches a new milestone

WBOY
2023-05-21 21:34

In 2017, the Google team proposed the groundbreaking NLP architecture Transformer in the paper "Attention Is All You Need", and it has been on a winning streak ever since.

Over the years, this architecture has been popular with large technology companies such as Microsoft, Google, and Meta. Even ChatGPT, which has swept the world, was developed based on Transformer.

And just today, the Transformers library's star count on GitHub exceeded 100,000!

Hugging Face, which started out as a chatbot app, rose to fame on the back of its Transformers library and has become a world-famous open source community.

To celebrate this milestone, Hugging Face also summarized 100 projects based on the Transformer architecture.

Transformer set the machine learning world on fire

In June 2017, when Google released the "Attention Is All You Need" paper, perhaps no one imagined how many surprises this deep learning architecture, the Transformer, would bring.

Since its birth, Transformer has become a cornerstone of the AI field. In 2019, Google even applied for a patent specifically for it.

As Transformer came to dominate the NLP field, it also began to cross over into other areas, and more and more work has attempted to bring it into the CV realm.

Many netizens were very excited to see Transformer break through this milestone.

"I have been a contributor to many popular open source projects, but seeing Transformer reach 10 on GitHub Ten thousand stars, it’s still very special!”

Some time ago, Auto-GPT's GitHub star count exceeded that of PyTorch, which caused a big stir.

Netizens couldn't help wondering: how does Auto-GPT compare with Transformers?

In fact, Auto-GPT far surpasses Transformers, already sitting at 130,000 stars.

Currently, TensorFlow has more than 170,000 stars. Transformers is thus the third machine learning repository to pass the 100,000-star mark, after these two projects.

Some netizens recalled that when they first used the Transformers library, it was called "pytorch-pretrained-BERT".

50 Awesome Projects Based on Transformers

Transformers is not only a toolkit for using pre-trained models: it is also a community of projects built around it and the Hugging Face Hub.

In the following list, Hugging Face summarizes 100 amazing and novel projects built on Transformers.

Below, we have selected the first 50 projects for introduction:

gpt4all

gpt4all is an open source chatbot ecosystem. It trains open source large language models, such as LLaMA and GPT-J, in an assistant style on a large collection of clean assistant data, including code, stories, and conversations.

Keywords: open source, LLaMA, GPT-J, instructions, assistant

recommenders

This repository contains examples and best practices for building recommender systems, provided as Jupyter notebooks. It covers several aspects needed to build an effective recommendation system: data preparation, modeling, evaluation, model selection and optimization, and operationalization.

Keywords: recommendation system, AzureML

lama-cleaner

An image inpainting tool powered by Stable Diffusion. You can erase any unwanted objects, defects, or even people from an image, and replace anything in it.

Keywords: inpainting, SD, Stable Diffusion

flair

FLAIR is a powerful PyTorch natural language processing framework covering several important tasks: NER, sentiment analysis, part-of-speech tagging, text and document embeddings, and more.
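
To give a feel for the API, here is a minimal NER sketch; "ner" is flair's standard pre-trained English tagger, and the example sentence is illustrative:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load a pre-trained named entity recognition model
tagger = SequenceTagger.load("ner")

# Wrap raw text in a Sentence object and run prediction
sentence = Sentence("George Washington went to Washington.")
tagger.predict(sentence)

# Print the detected entity spans and their labels
for entity in sentence.get_spans("ner"):
    print(entity)
```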

Keywords: NLP, text embedding, document embedding, biomedicine, NER, PoS, sentiment analysis

mindsdb

MindsDB is a low-code machine learning platform. It automatically integrates several ML frameworks into the data stack as "AI tables" to simplify the integration of AI into applications, making it accessible to developers of all skill levels.

Keywords: database, low code, AI table

langchain

LangChain is designed to assist in developing applications that combine LLMs with other sources of knowledge. The library allows chaining together calls to LLMs and other tools, creating sequences of operations.
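
As a rough sketch of what "chaining" means in practice, here is the kind of minimal example LangChain's 2023-era quickstart used (module paths have since been reorganized, and an OpenAI API key is assumed to be set in the environment):

```python
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# A prompt template with one input variable
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)

# Chain the template to an LLM call and run it
chain = LLMChain(llm=OpenAI(temperature=0.7), prompt=prompt)
print(chain.run("colorful socks"))
```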

Keywords: LLM, large language model, agent, chain

ParlAI

ParlAI is a Python framework for sharing, training, and testing dialogue models, from open-domain chat, to task-oriented dialogue, to visual question answering. It provides over 100 datasets, many pre-trained models, a set of agents, and several integrations under the same API.

Keywords: dialogue, chatbot, VQA, datasets, agents

sentence-transformers

This framework provides a simple way to compute dense vector representations of sentences, paragraphs, and images. The models are based on Transformer networks such as BERT/RoBERTa/XLM-RoBERTa and achieve SOTA across a variety of tasks. Text is embedded in a vector space such that similar texts are close together and can be found efficiently via cosine similarity.
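
A minimal sketch of the workflow ("all-MiniLM-L6-v2" is one of the library's standard pre-trained checkpoints):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = ["The cat sits on the mat.", "A feline rests on a rug."]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Similar texts end up close together in the vector space
print(util.cos_sim(embeddings[0], embeddings[1]))
```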

Keywords: dense vector representation, text embedding, sentence embedding

ludwig

Ludwig is a declarative machine learning framework that makes it easy to define machine learning pipelines through a simple and flexible data-driven configuration system. Ludwig targets a wide variety of AI tasks and provides a data-driven configuration system, training, prediction, and evaluation scripts, and a programmatic API.

Keywords: declarative, data-driven, ML framework

InvokeAI

InvokeAI is an engine for the Stable Diffusion model, aimed at professionals, artists, and enthusiasts. It leverages the latest AI-driven technology through a CLI as well as a WebUI.

Keywords: Stable Diffusion, WebUI, CLI

PaddleNLP

PaddleNLP is an easy-to-use and powerful NLP library, particularly for the Chinese language. It offers a large pre-trained model zoo and supports a wide range of NLP tasks from research to industrial applications.

Keywords: Natural Language Processing, Chinese, Research, Industry

stanza

The official Python NLP library of the Stanford NLP Group. It supports running a wide range of accurate natural language processing tools in more than 60 languages, and provides access to the Java Stanford CoreNLP software from Python.
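
A minimal sketch of the pipeline API for English (the download step fetches the pre-trained models once):

```python
import stanza

stanza.download("en")
nlp = stanza.Pipeline("en", processors="tokenize,pos,ner")

doc = nlp("Barack Obama was born in Hawaii.")

# Print the recognized entities and their types
for ent in doc.ents:
    print(ent.text, ent.type)
```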

Keywords: NLP, multi-language, CoreNLP

DeepPavlov

DeepPavlov is an open source conversational AI library. It is designed for the development of production-ready chatbots and complex dialogue systems, as well as for research in the field of NLP, and dialogue systems in particular.

Keywords: dialogue, chatbot

alpaca-lora

Alpaca-lora contains code for reproducing the Stanford Alpaca results using low-rank adaptation (LoRA). The repository provides training (fine-tuning) and generation scripts.
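
A rough sketch of the LoRA setup that alpaca-lora builds on, using the peft library (the checkpoint name and target module names apply to LLaMA-style models and are assumptions here):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")

config = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type=TaskType.CAUSAL_LM,
)

# Wrap the frozen base model with small trainable adapters
model = get_peft_model(base_model, config)
model.print_trainable_parameters()
```

Only the low-rank adapter weights are trained, which is what makes fine-tuning a 7B-parameter model feasible on modest hardware.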

Keywords: LoRA, parameter-efficient fine-tuning

imagen-pytorch

An open source implementation of Imagen, Google's closed-source text-to-image neural network that beats DALL-E 2. imagen-pytorch is the new SOTA for text-to-image synthesis.

Keywords: Imagen, text-to-image

adapter-transformers

adapter-transformers is an extension of the Transformers library that integrates adapters into state-of-the-art language models by incorporating AdapterHub, a central repository of pre-trained adapter modules. It is a drop-in replacement for Transformers and is updated regularly to keep pace with Transformers developments.

Keywords: Adapter, LoRA, parameter-efficient fine-tuning, Hub

NeMo

NVIDIA NeMo is a conversational AI toolkit built for researchers working on automatic speech recognition (ASR), text-to-speech synthesis (TTS), large language models, and natural language processing. The main goal of NeMo is to help researchers from industry and academia reuse prior work (code and pre-trained models) and make it easier to create new projects.
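
A rough sketch of ASR inference with NeMo's 1.x API ("QuartzNet15x5Base-En" is one of NVIDIA's published checkpoints; the audio path is a placeholder):

```python
import nemo.collections.asr as nemo_asr

# Download a pre-trained CTC speech recognition model
asr_model = nemo_asr.models.EncDecCTCModel.from_pretrained(
    model_name="QuartzNet15x5Base-En"
)

# Transcribe a 16 kHz mono WAV file
print(asr_model.transcribe(["audio.wav"]))
```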

Keywords: dialogue, ASR, TTS, LLM, NLP

Runhouse

Runhouse allows you to send code and data to any of your compute or data infrastructure, all in Python, and continue to interact with them normally from your existing code and environment. Runhouse's developers note:

You can think of it as an extension package for the Python interpreter that can span remote machines or operate on remote data.

Keywords: MLOps, infrastructure, data storage, modeling

MONAI

MONAI is part of the PyTorch ecosystem: an open source, PyTorch-based framework for deep learning in medical imaging. Its goals are:

- developing a community of academic, industrial, and clinical researchers collaborating on a common foundation;

- creating state-of-the-art, end-to-end training workflows for medical imaging;

- providing researchers with an optimized and standardized way to create and evaluate deep learning models.

Keywords: medical imaging, training, evaluation

simpletransformers

Simple Transformers lets you quickly train and evaluate Transformer models. Only 3 lines of code are needed to initialize, train, and evaluate a model. It supports a wide variety of NLP tasks.
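
A minimal sketch of the three-step workflow the description refers to, on toy data:

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Toy DataFrames with "text" and "labels" columns
train_df = pd.DataFrame(
    [["great movie", 1], ["terrible plot", 0]], columns=["text", "labels"]
)
eval_df = pd.DataFrame(
    [["really enjoyable", 1], ["quite boring", 0]], columns=["text", "labels"]
)

# Initialize, train, and evaluate
model = ClassificationModel("roberta", "roberta-base", num_labels=2, use_cuda=False)
model.train_model(train_df)
result, model_outputs, wrong_predictions = model.eval_model(eval_df)
```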

Keywords: framework, simplicity, NLP

JARVIS

JARVIS is a system in which an LLM such as GPT-4 acts as a controller, connecting to other models from the open source machine learning community and leveraging up to 60 downstream models to perform tasks identified by the LLM.

Keywords: LLM, agent, HF Hub

transformers.js

transformers.js is a JavaScript library that aims to run models from transformers directly in the browser.

Keywords: Transformers, JavaScript, browser

bumblebee

Bumblebee provides pre-trained neural network models on top of Axon, a neural network library for the Elixir language. It includes integrations with models, allowing anyone to download and perform machine learning tasks with just a few lines of code.

Keywords: Elixir, Axon

argilla

Argilla is an open source platform providing advanced NLP labeling, monitoring, and workspaces. It is compatible with many open source ecosystems such as Hugging Face, Stanza, FLAIR, and others.

Keywords: NLP, labeling, monitoring, workspace

haystack

Haystack is an open source NLP framework for interacting with data using Transformer models and LLMs. It provides production-ready tools for quickly building complex applications for decision making, question answering, semantic search, text generation, and more.
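
A rough sketch of an extractive QA pipeline with the Haystack 1.x API (the reader checkpoint is one of the models commonly used in Haystack's tutorials):

```python
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

# Index a toy document
document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents(
    [{"content": "The Eiffel Tower is located in Paris and is 330 metres tall."}]
)

# The retriever narrows down candidates; the reader extracts the answer span
retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")

pipeline = ExtractiveQAPipeline(reader, retriever)
result = pipeline.run(
    query="Where is the Eiffel Tower?",
    params={"Retriever": {"top_k": 3}, "Reader": {"top_k": 1}},
)
print(result["answers"][0].answer)
```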

Keywords: NLP, Framework, LLM

spaCy

SpaCy is a library for advanced natural language processing in Python and Cython. It is built on the latest research and designed from the ground up for use in real products. It provides support for the Transformers model through its third-party package spacy-transformers.
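
A minimal sketch (the small English pipeline must be installed first with `python -m spacy download en_core_web_sm`):

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

# Print the named entities spaCy found
for ent in doc.ents:
    print(ent.text, ent.label_)
```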

Keywords: NLP, framework

speechbrain

SpeechBrain is an open source, all-in-one conversational AI toolkit based on PyTorch. Its goal is to create a single, flexible, user-friendly toolkit for easily developing state-of-the-art speech technologies, including systems for speech recognition, speaker recognition, speech enhancement, speech separation, language identification, multi-microphone signal processing, and more.

Keywords: dialogue, speech

skorch

Skorch is a scikit-learn-compatible neural network library that wraps PyTorch. It supports models from Transformers as well as tokenizers from tokenizers.
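
A minimal sketch of the scikit-learn-style interface around a plain PyTorch module (the toy data is random and purely illustrative):

```python
import numpy as np
import torch
from torch import nn
from skorch import NeuralNetClassifier

class SimpleMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))

    def forward(self, X):
        return self.net(X)

X = np.random.randn(100, 20).astype(np.float32)
y = np.random.randint(0, 2, 100).astype(np.int64)

# fit/predict work exactly like a scikit-learn estimator
net = NeuralNetClassifier(
    SimpleMLP, criterion=torch.nn.CrossEntropyLoss, max_epochs=5, lr=0.1
)
net.fit(X, y)
print(net.predict(X[:5]))
```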

Keywords: scikit-learn, PyTorch

bertviz

BertViz is an interactive tool for visualizing attention in Transformer language models such as BERT, GPT-2, or T5. It can be run inside a Jupyter or Colab notebook through a simple Python API that supports most Hugging Face models.
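
A rough sketch of head_view usage (this must be run inside a Jupyter or Colab notebook, since the visualization renders as HTML/JavaScript):

```python
from transformers import AutoModel, AutoTokenizer
from bertviz import head_view

model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

inputs = tokenizer("The cat sat on the mat", return_tensors="pt")
outputs = model(**inputs)

# Visualize per-head attention weights over the tokens
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
head_view(outputs.attentions, tokens)
```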

Keywords: Visualization, Transformers

mesh-transformer-jax

mesh-transformer-jax is a Haiku library that implements model parallelism for Transformers using the xmap/pjit operators in JAX.

This library is designed to scale up to approximately 40B parameters on TPUv3, and it is the library that was used to train the GPT-J model.

Keywords: Haiku, model parallelism, LLM, TPU

deepchem

DeepChem aims to provide a high-quality open source toolchain that democratizes the use of deep learning in drug discovery, materials science, quantum chemistry, and biology.

Keywords: drug discovery, materials science, quantum chemistry, biology

OpenNRE

An open source package for neural relation extraction (NRE). It targets a wide range of users, from newcomers to developers, researchers, and students.

Keywords: neural relation extraction, framework

pycorrector

A Chinese text error correction tool. It uses a language model to detect errors, and pinyin and glyph-shape features to correct them. It can be used for Chinese pinyin and stroke input methods.
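
A minimal sketch of pycorrector's classic module-level API (newer releases moved to a Corrector class, so treat this as version-dependent):

```python
import pycorrector

# Correct a sentence and inspect the detected errors
corrected_sentence, details = pycorrector.correct("少先队员因该为老人让坐")
print(corrected_sentence)  # expected: 少先队员应该为老人让座
print(details)             # list of (wrong, right, begin_idx, end_idx)
```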

Keywords: Chinese, error correction tools, language model, Pinyin

nlpaug

This Python library helps you with data augmentation for NLP in machine learning projects. It is a lightweight library for generating synthetic data to improve model performance; it supports audio and text, and is compatible with several ecosystems (scikit-learn, PyTorch, TensorFlow).
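
A minimal sketch of synonym-based text augmentation (this augmenter requires the NLTK WordNet corpus to be available):

```python
import nlpaug.augmenter.word as naw

# Replace random words with WordNet synonyms
aug = naw.SynonymAug(aug_src="wordnet")
text = "The quick brown fox jumps over the lazy dog"
print(aug.augment(text))
```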

Keywords: data augmentation, synthetic data generation, audio, natural language processing

dream-textures

dream-textures is a library designed to bring Stable Diffusion support to Blender. It supports multiple use cases such as image generation, texture projection, inpainting/outpainting, ControlNet, and upscaling.

Keywords: Stable-Diffusion, Blender

seldon-core

Seldon Core turns your ML models (TensorFlow, PyTorch, H2O, etc.) or language wrappers (Python, Java, etc.) into production-grade REST/gRPC microservices. Seldon handles scaling to thousands of production machine learning models and provides advanced machine learning features out of the box, including advanced metrics, request logging, explainers, outlier detectors, A/B tests, canaries, and more.

Keywords: microservices, modeling, language wrappers

open_model_zoo

This library includes optimized deep learning models and a set of demos to accelerate the development of high-performance deep learning inference applications. Use these free pre-trained models instead of training your own to speed up development and production deployment processes.

Keywords: optimized models, demos

ml-stable-diffusion

ML-Stable-Diffusion is Apple's repository that brings Stable Diffusion to Core ML on Apple Silicon devices. It supports Stable Diffusion checkpoints hosted on the Hugging Face Hub.

Keywords: Stable Diffusion, Apple Silicon, Core ML

stable-dreamfusion

Stable-Dreamfusion is a PyTorch implementation of the text-to-3D model DreamFusion, powered by the Stable Diffusion text-to-2D model.

Keywords: Text to 3D, Stable Diffusion

txtai

Txtai is an open source platform for semantic search and workflows driven by language models. Txtai builds embeddings databases, a union of vector indexes and relational databases that enables SQL-based nearest-neighbor search. Semantic workflows connect language models together into unified applications.
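
A minimal sketch of the embeddings database (the model path is a standard sentence-transformers checkpoint and is an assumption here):

```python
from txtai.embeddings import Embeddings

embeddings = Embeddings({"path": "sentence-transformers/all-MiniLM-L6-v2"})

data = [
    "US tops 5 million confirmed virus cases",
    "Beijing mobilises invasion craft along coast",
]

# Index (id, text, tags) tuples, then run a semantic search
embeddings.index([(uid, text, None) for uid, text in enumerate(data)])
print(embeddings.search("health pandemic", 1))
```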

Keywords: semantic search, LLM

djl

Deep Java Library (DJL) is an open source, high-level, engine-agnostic Java framework for deep learning that is easy for developers to use. DJL provides a native Java development experience and functions like any other regular Java library. DJL provides Java bindings for the HuggingFace Tokenizer and a simple conversion toolkit for deploying HuggingFace models in Java.

Keywords: Java, framework

lm-evaluation-harness

This project provides a unified framework for testing generative language models on a large number of different evaluation tasks. It supports more than 200 tasks and multiple ecosystems: HF Transformers, GPT-NeoX, DeepSpeed, and the OpenAI API.

Keywords: LLM, evaluation, few-shot

gpt-neox

This repository hosts EleutherAI's library for training large-scale language models on GPUs. The framework is based on NVIDIA's Megatron language model and augmented with techniques from DeepSpeed as well as some novel optimizations. Its focus is on training models with billions of parameters.

Keywords: training, LLM, Megatron, DeepSpeed

muzic

Muzic is a research project on AI music that enables music understanding and generation through deep learning and artificial intelligence. Muzic was created by researchers at Microsoft Research Asia.

Keywords: music understanding, music generation

dalle-flow

DALL·E Flow is an interactive workflow for generating high-definition images from a text prompt. It uses DALL·E-Mega, GLID-3 XL, and Stable Diffusion to generate candidate images, then calls CLIP-as-service to rank the candidates against the prompt. The preferred candidate is fed to GLID-3 XL for diffusion, which often enriches textures and backgrounds. Finally, the candidate is upscaled to 1024x1024 via SwinIR.

Keywords: High-definition image generation, Stable Diffusion, DALL-E Mega, GLID-3 XL, CLIP, SwinIR

lightseq

LightSeq is a high-performance training and inference library implemented in CUDA for sequence processing and generation. It is capable of efficiently computing modern NLP and CV models such as BERT, GPT, Transformer, etc. Therefore, it is useful for machine translation, text generation, image classification, and other sequence-related tasks.

Keywords: training, inference, sequence processing, sequence generation

LaTeX-OCR

The goal of this project is to create a learning-based system that takes an image of a mathematical formula and returns the corresponding LaTeX code.

Keywords: OCR, LaTeX, mathematical formulas

open_clip

OpenCLIP is an open source implementation of OpenAI’s CLIP.

The goal of this repository is to enable the training of models with contrastive image-text supervision and to study their properties such as robustness to distribution shifts. The starting point of the project is an implementation of CLIP that matches the accuracy of the original CLIP model when trained on the same dataset.

Specifically, a ResNet-50 model trained with this codebase on OpenAI's 15-million-image subset of YFCC achieves 32.7% top-1 accuracy on ImageNet.
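
A rough sketch of zero-shot image-text scoring with open_clip (the model/pretrained tags are examples of available checkpoints, and "cat.jpg" is a placeholder path):

```python
import torch
from PIL import Image
import open_clip

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    # Embed both modalities and normalize to unit length
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    # Softmax over caption similarities
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)
```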

Keywords: CLIP, open source, contrastive, image-text

dalle-playground

A playground to generate images from any text prompt using Stable Diffusion and Dall-E mini.

Keywords: WebUI, Stable Diffusion, Dall-E mini

FedML

FedML is a federated learning and analytics library that enables secure and collaborative machine learning on distributed data anywhere and at any scale.

Keywords: federated learning, analysis, collaborative machine learning, decentralized
