Glossary Terms

Zero-Shot Learning

A machine learning approach where a model can correctly make predictions on classes it has never seen during training, by leveraging auxiliary information, such as semantic attributes or textual descriptions, that relates unseen classes to seen ones.

Z-Score Normalization

A standardisation method that rescales data so that it has a mean of 0 and a standard deviation of 1.
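
As a minimal sketch of the idea in plain Python (production code would typically use a library implementation such as scikit-learn's StandardScaler):

```python
# Z-score normalization: subtract the mean, divide by the standard deviation.
from statistics import mean, pstdev

def z_score_normalize(values):
    """Rescale values to mean 0 and (population) standard deviation 1."""
    mu = mean(values)
    sigma = pstdev(values)
    return [(v - mu) / sigma for v in values]

data = [10, 20, 30, 40, 50]
normalized = z_score_normalize(data)
# After normalization the values have mean 0 and standard deviation 1.
```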

XGBoost

An efficient and scalable implementation of gradient boosting, known for its high performance and flexibility. It is widely used in machine learning competitions and in industry applications involving tabular data.

Word Embedding

A technique in NLP that maps words or phrases into dense numerical vectors that capture semantic meaning. Examples include Word2Vec and GloVe.
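
A toy illustration of why such vectors are useful: semantically related words end up with similar vectors, which can be measured with cosine similarity. The three-dimensional vectors below are invented for illustration only; real embeddings are learned from large corpora and have hundreds of dimensions.

```python
# Cosine similarity between hand-made toy "word vectors".
from math import sqrt

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical vectors: "king" and "queen" point in similar directions,
# "banana" does not.
king = [0.9, 0.8, 0.1]
queen = [0.85, 0.82, 0.15]
banana = [0.1, 0.05, 0.9]

sim_royal = cosine_similarity(king, queen)      # high similarity
sim_unrelated = cosine_similarity(king, banana) # low similarity
```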

Voting Classifier

An ensemble method that combines predictions from multiple models by taking the majority vote (classification) or average (regression) to make a final prediction.
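
A minimal sketch of hard (majority) voting over the predictions of three hypothetical classifiers; a real ensemble (e.g. scikit-learn's VotingClassifier) combines fitted models rather than precomputed labels:

```python
# Hard voting: each sample's final label is the most common prediction
# across the individual models.
from collections import Counter

def majority_vote(predictions_per_model):
    """predictions_per_model: one list of labels per model, aligned by sample."""
    final = []
    for labels in zip(*predictions_per_model):
        final.append(Counter(labels).most_common(1)[0][0])
    return final

# Three models predicting labels for four samples.
model_a = ["cat", "dog", "dog", "cat"]
model_b = ["cat", "cat", "dog", "dog"]
model_c = ["dog", "dog", "dog", "cat"]
ensemble = majority_vote([model_a, model_b, model_c])
# -> ["cat", "dog", "dog", "cat"]
```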

Vectorization

The process of converting data, especially text or images, into numerical vectors that can be processed by machine learning algorithms.
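 
For text, a simple form of vectorization is the bag-of-words representation. The sketch below is illustrative only; libraries such as scikit-learn's CountVectorizer do this robustly at scale:

```python
# Bag-of-words vectorization: each document becomes a vector of word counts
# over a shared vocabulary.
def build_vocabulary(documents):
    vocab = sorted({word for doc in documents for word in doc.lower().split()})
    return {word: i for i, word in enumerate(vocab)}

def vectorize(document, vocab):
    vector = [0] * len(vocab)
    for word in document.lower().split():
        if word in vocab:
            vector[vocab[word]] += 1
    return vector

docs = ["the cat sat", "the dog sat"]
vocab = build_vocabulary(docs)           # {"cat": 0, "dog": 1, "sat": 2, "the": 3}
vectors = [vectorize(d, vocab) for d in docs]
# -> [[1, 0, 1, 1], [0, 1, 1, 1]]
```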

Variance

The sensitivity of a model to fluctuations in the training data. High variance can cause the model to overfit, fitting noise in the training data and generalising poorly to unseen data.

Validation Set

A subset of the training data used to fine-tune model hyperparameters and monitor overfitting during training.
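
A sketch of carving a validation set out of the training data with a shuffled split; in practice scikit-learn's train_test_split is the usual tool:

```python
# Hold out a fraction of the data as a validation set.
import random

def train_val_split(samples, val_fraction=0.2, seed=42):
    rng = random.Random(seed)           # fixed seed for reproducibility
    indices = list(range(len(samples)))
    rng.shuffle(indices)
    n_val = int(len(samples) * val_fraction)
    val_idx = set(indices[:n_val])
    train = [s for i, s in enumerate(samples) if i not in val_idx]
    val = [s for i, s in enumerate(samples) if i in val_idx]
    return train, val

data = list(range(100))
train, val = train_val_split(data)  # 80 training samples, 20 validation samples
```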

Undersampling

A method to balance class distribution by reducing the number of samples in the majority class, which can improve performance on imbalanced datasets.
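
A plain-Python sketch of random undersampling, shrinking every class to the minority class size; libraries such as imbalanced-learn provide production-grade versions:

```python
# Random undersampling: keep only as many samples of each class as the
# minority class has, discarding the rest at random.
import random
from collections import Counter

def undersample(samples, labels, seed=0):
    rng = random.Random(seed)
    counts = Counter(labels)
    minority_size = min(counts.values())
    balanced_samples, balanced_labels = [], []
    for label in counts:
        idx = [i for i, l in enumerate(labels) if l == label]
        for i in rng.sample(idx, minority_size):
            balanced_samples.append(samples[i])
            balanced_labels.append(label)
    return balanced_samples, balanced_labels

X = list(range(10))
y = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1]  # 7 majority-class vs 3 minority-class samples
X_bal, y_bal = undersample(X, y)    # 3 samples of each class remain
```

Note the trade-off: the discarded majority-class samples may carry useful information, which is why undersampling is often compared against oversampling the minority class.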

Transformer Model

A deep learning architecture based on self-attention mechanisms, widely used in NLP tasks. It powers models like BERT, GPT, and T5.
