Supervisor
Dr Muhammad Iqbal
Programme
MSc in Data Analytics
Subject
Computer Science
Abstract
Even in the era of Big Data, small datasets remain the reality for many companies and sectors. Datasets in rare disease diagnosis, custom manufacturing, military science, bioengineering, and disaster response are commonly limited in size, making machine learning predictive modelling difficult. Because the choice of activation function is crucial to how neural networks learn, the question arises of how effective different activations are in such scenarios. This study compares five standard (single) activation functions (Sigmoid, Tanh, ReLU, Leaky ReLU, ELU) and two hybrid variants (a fixed blend of ReLU and Tanh, and a learnable activation function with a trainable weight (alpha) that balances ReLU and Tanh in each layer) in feed-forward neural networks on one small dataset (Iris) and two large datasets (Adult, Adult Expanded). The architecture is held constant while activations, optimizers (SGD, Adam, RMSprop), scalers (Standard, Min-Max, Robust), batch sizes (16, 32) and epoch counts (50, 100) are varied, yielding 756 models (7 activations × 3 optimizers × 3 scalers × 2 batch sizes × 2 epoch settings × 3 datasets). Performance is evaluated via test accuracy and loss convergence. The hybrids (fixed ReLU-Tanh blend and Learnable Hybrid) consistently match or outperform the single activations on the small Iris dataset, where they achieve 100% test accuracy; on Adult and Adult Expanded they rank among the top models (~0.853 test accuracy), and their loss curves fall faster in the first half of training. The findings suggest hybrid activations are a top choice when data are scarce and a strong rule of thumb for large datasets as well, in line with Kavun (2025), where hybrid activations likewise outperformed single activation functions. This opens discussion of practical guidance and avenues for broader benchmarking on small datasets: neural networks expose a multitude of parameters still to be explored, including architectures, data types, hyperparameter tuning, and other combinations of activation-function ensembles.
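The abstract does not give the exact formulation of the hybrid activations, so the following is a minimal PyTorch sketch of how the learnable variant could work, assuming a convex combination of ReLU and Tanh with one trainable weight (alpha) per layer, constrained to (0, 1) via a sigmoid; the class name LearnableHybrid, the sigmoid parameterisation, and the 50/50 fixed blend are illustrative assumptions, not the dissertation's code.

```python
import torch
import torch.nn as nn

class LearnableHybrid(nn.Module):
    """Sketch of the learnable hybrid activation described in the abstract:
    a trainable weight (alpha) balances ReLU and Tanh in each layer.
    The convex-combination form and the sigmoid constraint on alpha are
    assumptions; the dissertation's exact formulation is not stated."""

    def __init__(self, alpha_init: float = 0.0):
        super().__init__()
        # One trainable scalar per layer instance (assumed granularity).
        self.alpha = nn.Parameter(torch.tensor(alpha_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = torch.sigmoid(self.alpha)  # keep the blend weight in (0, 1)
        return a * torch.relu(x) + (1.0 - a) * torch.tanh(x)


def fixed_hybrid(x: torch.Tensor, a: float = 0.5) -> torch.Tensor:
    """Fixed ReLU-Tanh blend; the even 50/50 weighting is an assumption."""
    return a * torch.relu(x) + (1.0 - a) * torch.tanh(x)
```

With alpha_init = 0.0 the blend starts at an even mix (sigmoid(0) = 0.5), and training moves alpha toward whichever component helps that layer most. The 756-model count follows directly from the full cross of the factors listed in the abstract; a quick enumeration (the factor labels below are illustrative) confirms the arithmetic:

```python
import itertools

activations = ["sigmoid", "tanh", "relu", "leaky_relu", "elu",
               "fixed_hybrid", "learnable_hybrid"]        # 7 activation functions
optimizers  = ["sgd", "adam", "rmsprop"]                  # 3 optimizers
scalers     = ["standard", "minmax", "robust"]            # 3 scalers
batch_sizes = [16, 32]                                    # 2 batch sizes
epochs      = [50, 100]                                   # 2 epoch settings
datasets    = ["iris", "adult", "adult_expanded"]         # 3 datasets

configs = list(itertools.product(activations, optimizers, scalers,
                                 batch_sizes, epochs, datasets))
print(len(configs))  # 756, matching the model count reported in the abstract
```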
Date of Award
2025
Full Publication Date
2025
Access Rights
open access
Document Type
Dissertation
Resource Type
thesis
Recommended Citation
Martins, A. C. (2025) Neural Networks Activation Functions and Hybrid Activation Functions accuracy and loss comparison on small datasets against large datasets for classification problems. Dissertation. CCT College Dublin.