


This article summarizes the classic methods of feature enhancement & personalization in CTR prediction and compares their effectiveness.
In CTR prediction, the mainstream approach is the feature embedding + MLP paradigm, in which features are critical. However, the same feature receives an identical representation across different samples; feeding these static embeddings to the downstream model limits its expressive ability.
To solve this problem, a line of work known as feature enhancement modules has been proposed in the CTR field. A feature enhancement module adjusts the output of the embedding layer on a per-sample basis, adapting feature representations to each sample and improving the expressive ability of the model.
Recently, Fudan University and Microsoft Research Asia jointly released a survey of feature enhancement work, comparing the implementation and effectiveness of different feature enhancement modules. Below, we introduce how several of these modules are implemented, along with the comparative experiments conducted in the paper.
Title of the paper: A Comprehensive Summarization and Evaluation of Feature Refinement Modules for CTR Prediction
Download address: https://arxiv.org/pdf/2311.04625v1.pdf
1. Feature enhancement modeling idea
The feature enhancement module is designed to improve the expressive ability of the embedding layer in CTR prediction models, enabling the same feature to take different representations in different samples. It can be expressed by the unified formula below: the original embeddings are fed in, and a refinement function produces the personalized embeddings for the sample.
E′ = F(E), where E = [e1; e2; …; ef] stacks the sample's original feature embeddings and E′ is the refined, sample-specific version produced by the refinement function F.
The general idea is that, after obtaining the initial embedding of each feature, the representation of the sample itself is used to transform the feature embeddings, yielding embeddings personalized to the current sample. Below we introduce some classic feature enhancement module designs.
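To make the unified formulation concrete, here is a minimal PyTorch sketch of where a refinement module sits in the embedding + MLP paradigm. The class names, layer sizes, and interface are illustrative assumptions, not from the paper.

```python
import torch
import torch.nn as nn

class CTRModel(nn.Module):
    """Embedding + MLP backbone with a pluggable refinement module F (illustrative)."""
    def __init__(self, num_fields, emb_dim, refine: nn.Module):
        super().__init__()
        self.refine = refine                           # the refinement function F
        self.mlp = nn.Sequential(
            nn.Linear(num_fields * emb_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, emb):                            # emb: (B, num_fields, emb_dim)
        refined = self.refine(emb)                     # E' = F(E), sample-specific embeddings
        return torch.sigmoid(self.mlp(refined.flatten(1)))
```

Each of the modules below can be dropped in as `refine` without touching the rest of the model, which is exactly what makes this family of methods easy to compare.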
2. Classic methods of feature enhancement
An Input-aware Factorization Machine for Sparse Prediction (IJCAI 2019) adds a reweight layer after the embedding layer. The sample's initial embeddings are fed into an MLP to obtain a vector representing the sample, which is normalized with softmax. Each element of the softmax output corresponds to one feature and represents its importance. Multiplying the softmax result with the initial embedding of the corresponding feature achieves sample-level weighting of feature embeddings.
[Figure: the IFM reweighting layer]
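Read this way, the reweight layer reduces to an MLP over the sample's flattened embeddings followed by a softmax. A minimal sketch under that reading; the class name and hidden size are my own, not from the paper:

```python
import torch
import torch.nn as nn

class IFMReweight(nn.Module):
    """Sketch of an IFM-style reweighting layer (illustrative)."""
    def __init__(self, num_fields, emb_dim, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_fields * emb_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_fields),         # one score per feature field
        )

    def forward(self, emb):                        # emb: (B, F, D)
        scores = self.mlp(emb.flatten(1))          # (B, F) sample-level scores
        weights = torch.softmax(scores, dim=1)     # importance of each feature
        return emb * weights.unsqueeze(-1)         # weight each field's embedding
```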
FiBiNET: Combining Feature Importance and Bilinear Feature Interaction for Click-Through Rate Prediction (RecSys 2019) adopts a similar idea: the model learns a personalized weight for each feature in each sample. The process has three steps: squeeze, extraction, and reweight. In the squeeze stage, each feature's embedding vector is pooled into a summary scalar. In the extraction stage, these scalars are fed into a multilayer perceptron (MLP) to obtain a weight for each feature. Finally, each weight is multiplied with the corresponding feature's embedding vector to obtain the reweighted embeddings, which amounts to filtering feature importance at the sample level.
[Figure: FiBiNET's squeeze, extraction, and reweight steps]
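A minimal sketch of the squeeze → extraction → reweight pipeline as described; mean pooling for the squeeze and the reduction ratio are assumptions on my part:

```python
import torch
import torch.nn as nn

class SENETLayer(nn.Module):
    """Sketch of FiBiNET-style squeeze-extraction-reweight (illustrative)."""
    def __init__(self, num_fields, reduction=4):
        super().__init__()
        mid = max(num_fields // reduction, 1)
        self.excite = nn.Sequential(
            nn.Linear(num_fields, mid), nn.ReLU(),
            nn.Linear(mid, num_fields), nn.ReLU(),
        )

    def forward(self, emb):                  # emb: (B, F, D)
        z = emb.mean(dim=-1)                 # squeeze: one summary scalar per field
        a = self.excite(z)                   # extraction: per-field weights (B, F)
        return emb * a.unsqueeze(-1)         # reweight the field embeddings
```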
A Dual Input-aware Factorization Machine for CTR Prediction (IJCAI 2020) extends the IFM above and also uses self-attention to enhance features. It consists of two modules, vector-wise and bit-wise. The vector-wise module treats each feature's embedding as an element of a sequence and feeds it into a Transformer to obtain fused feature representations; the bit-wise module maps the original features with a multi-layer MLP. The outputs of the two parts are added to obtain a weight for each element of each feature, which is multiplied with each bit of the corresponding original feature to produce the enhanced features.
[Figure: the dual (vector-wise and bit-wise) architecture of DIFM]
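A rough sketch of the two branches as described above: self-attention over the field embeddings for the vector-wise part, an MLP over the flattened embeddings for the bit-wise part, and their sum used as element-level weights. All names, sizes, and the single-attention-layer simplification are illustrative:

```python
import torch
import torch.nn as nn

class DualFEN(nn.Module):
    """Sketch of a DIFM-style dual feature-enhancement block (illustrative)."""
    def __init__(self, num_fields, emb_dim, heads=2):
        super().__init__()
        flat = num_fields * emb_dim
        # emb_dim must be divisible by heads for multi-head attention
        self.attn = nn.MultiheadAttention(emb_dim, heads, batch_first=True)
        self.vec_proj = nn.Linear(flat, flat)     # vector-wise branch output
        self.bit_mlp = nn.Sequential(             # bit-wise branch
            nn.Linear(flat, flat), nn.ReLU(), nn.Linear(flat, flat),
        )

    def forward(self, emb):                       # emb: (B, F, D)
        B, F, D = emb.shape
        v, _ = self.attn(emb, emb, emb)           # fields attend to each other
        w = self.vec_proj(v.flatten(1)) + self.bit_mlp(emb.flatten(1))
        return emb * w.view(B, F, D)              # element-wise reweighting
```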
GateNet: Gating-Enhanced Deep Network for Click-Through Rate Prediction (2020) passes each feature's initial embedding through an MLP and a sigmoid function to generate an independent weight score for that feature, while another MLP maps all features to bit-wise weight scores; the two are combined to weight the input features. Beyond the feature layer, a similar gating method is applied to the input of each hidden layer of the MLP.
[Figure: GateNet's embedding-layer and hidden-layer gates]
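A minimal sketch of the per-feature embedding gate only; the bit-wise scores and the hidden-layer gates described above would follow the same pattern. The class name and per-field linear gates are illustrative assumptions:

```python
import torch
import torch.nn as nn

class FeatureGate(nn.Module):
    """Sketch of a GateNet-style embedding gate (illustrative)."""
    def __init__(self, num_fields, emb_dim):
        super().__init__()
        # one gate network per field, computed from that field's own embedding
        self.gates = nn.ModuleList(
            [nn.Linear(emb_dim, emb_dim) for _ in range(num_fields)]
        )

    def forward(self, emb):                          # emb: (B, F, D)
        gated = [
            emb[:, i] * torch.sigmoid(g(emb[:, i]))  # sigmoid gate per field
            for i, g in enumerate(self.gates)
        ]
        return torch.stack(gated, dim=1)
```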
Interpretable Click-Through Rate Prediction through Hierarchical Attention (WSDM 2020) also uses self-attention for feature transformation, but adds the generation of high-order features. It uses hierarchical self-attention: each layer takes the output of the previous self-attention layer as input and adds one more order of feature combination, achieving layer-by-layer multi-order feature extraction. Specifically, after each layer's self-attention, the resulting feature matrix is passed through a softmax to obtain per-feature weights; the new features are aggregated according to these weights and then crossed with the original features, raising the feature interaction by one order.
[Figure: hierarchical attention layers for multi-order feature extraction]
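A simplified sketch of one such layer under my reading of the text: self-attention, per-feature softmax weighting and aggregation, then an element-wise cross with the original first-order features to raise the interaction order. The exact aggregation and cross in the paper may differ:

```python
import torch
import torch.nn as nn

class HierarchicalAttnLayer(nn.Module):
    """Sketch of one hierarchical-attention step (simplified, illustrative)."""
    def __init__(self, emb_dim, heads=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(emb_dim, heads, batch_first=True)
        self.score = nn.Linear(emb_dim, 1)

    def forward(self, x_k, x_1):                  # k-th order feats, original feats
        h, _ = self.attn(x_k, x_k, x_k)           # self-attention over features
        w = torch.softmax(self.score(h), dim=1)   # per-feature weights (B, F, 1)
        agg = (w * h).sum(dim=1, keepdim=True)    # weighted aggregate (B, 1, D)
        return x_1 * agg                          # cross with 1st order -> order k+1
```

Stacking several of these layers, each feeding the next, yields the hierarchical multi-order extraction the paper describes.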
ContextNet: A Click-Through Rate Prediction Framework Using Contextual Information to Refine Feature Embedding (2021) takes a similar approach, using an MLP to map all of a sample's features to a vector of each feature's embedding size, which scales the original features. The paper uses personalized MLP parameters for each feature. In this way, each feature is enhanced using the other features of the sample as context.
[Figure: ContextNet's contextual embedding and refinement blocks]
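A minimal sketch of this contextual scaling with per-feature projection parameters, as described; the shared aggregation layer and all sizes are assumptions:

```python
import torch
import torch.nn as nn

class ContextNetBlock(nn.Module):
    """Sketch of ContextNet-style contextual scaling (illustrative)."""
    def __init__(self, num_fields, emb_dim, hidden=64):
        super().__init__()
        self.shared = nn.Linear(num_fields * emb_dim, hidden)  # aggregate the sample
        # a separate projection per field, reflecting per-feature MLP parameters
        self.per_field = nn.ModuleList(
            [nn.Linear(hidden, emb_dim) for _ in range(num_fields)]
        )

    def forward(self, emb):                       # emb: (B, F, D)
        ctx = torch.relu(self.shared(emb.flatten(1)))              # sample context
        scales = torch.stack([p(ctx) for p in self.per_field], dim=1)  # (B, F, D)
        return emb * scales                       # scale each feature by its context
```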
Enhancing CTR Prediction with Context-Aware Feature Representation Learning (SIGIR 2022) uses self-attention for feature enhancement. For a set of input features, each feature influences the others to a different degree; self-attention over the feature embeddings realizes information interaction among the features within a sample. Beyond this feature-level interaction, the paper also uses an MLP for bit-level interaction. The newly generated embeddings are merged with the original embeddings through a gate network to obtain the final refined feature representations.
[Figure: FRNet's feature-level and bit-level interaction with a gated merge]
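A rough sketch combining the three ingredients described: feature-level self-attention, a bit-level MLP, and a gate that merges the new embeddings with the originals. The exact gating form in FRNet differs, so treat this as illustrative only:

```python
import torch
import torch.nn as nn

class FRNetBlock(nn.Module):
    """Sketch of FRNet-style refinement with a gated merge (illustrative)."""
    def __init__(self, num_fields, emb_dim, heads=2):
        super().__init__()
        flat = num_fields * emb_dim
        self.attn = nn.MultiheadAttention(emb_dim, heads, batch_first=True)
        self.bit_mlp = nn.Sequential(nn.Linear(flat, flat), nn.ReLU())
        self.gate = nn.Linear(flat, flat)

    def forward(self, emb):                             # emb: (B, F, D)
        B, F, D = emb.shape
        v, _ = self.attn(emb, emb, emb)                 # feature-level interaction
        u = self.bit_mlp(emb.flatten(1)).view(B, F, D)  # bit-level interaction
        g = torch.sigmoid(self.gate(emb.flatten(1))).view(B, F, D)
        return g * (v + u) + (1 - g) * emb              # gated merge with original
```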
3. Experimental results
Comparing the effects of the various feature enhancement methods, the overall conclusion is that GFRL, FRNet-V, and FRNet-B perform best, outperforming the other feature enhancement modules.
[Figure: overall performance comparison of the feature enhancement modules]