Photogrammetric Engineering and Remote Sensing, vol. 69, no. 11, pp. 1225-1234, 2003 (SCI-Expanded)
Neural networks, recently applied to a number of image classification problems, are computational systems consisting of neurons or nodes arranged in layers with interconnecting links. Although there is a wide range of network types and possible applications in remote sensing, most attention has focused on the use of MultiLayer Perceptron (MLP) or FeedForward (FF) networks trained with a backpropagation learning algorithm for supervised classification. One of the main characteristic elements of an artificial neural network (ANN) is the activation function. Nonlinear (logistic sigmoid and hyperbolic tangent) and linear activation functions have been used effectively with MLP networks for various purposes. The main objective of this study is to compare the sigmoid, hyperbolic tangent, and linear activation functions in one- and two-hidden-layer MLP network structures trained with the scaled conjugate gradient learning algorithm, and to evaluate their performance on the classification of multispectral Landsat TM imagery.
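For orientation, the three activation functions compared in the study can be written as f(x) = 1/(1 + e^(-x)) for the sigmoid, f(x) = tanh(x) for the hyperbolic tangent, and f(x) = x for the linear case. The sketch below is not the authors' implementation; the 6-band input, layer sizes, and random weights are illustrative assumptions only, showing how swapping the hidden-layer activation changes the forward pass of a one-hidden-layer MLP.

```python
import numpy as np

# Candidate activation functions compared in the study.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def linear(x):
    return x

def mlp_forward(x, w1, b1, w2, b2, activation):
    """Forward pass of a one-hidden-layer MLP.

    The hidden layer uses the chosen activation; the output layer
    is kept linear here for simplicity (an assumption, not from the paper).
    """
    hidden = activation(x @ w1 + b1)
    return hidden @ w2 + b2

# Toy example: 6 input bands (hypothetical Landsat TM reflective bands)
# -> 4 hidden nodes -> 3 land-cover classes. All sizes are illustrative.
rng = np.random.default_rng(0)
x = rng.random((1, 6))                      # one pixel's band values, scaled to [0, 1]
w1, b1 = rng.normal(size=(6, 4)), np.zeros(4)
w2, b2 = rng.normal(size=(4, 3)), np.zeros(3)

for name, act in [("sigmoid", sigmoid), ("tanh", tanh), ("linear", linear)]:
    print(name, mlp_forward(x, w1, b1, w2, b2, act))
```

With identical weights, the linear activation leaves the network equivalent to a single linear mapping, whereas the sigmoid and hyperbolic tangent introduce the nonlinearity that the study evaluates against it.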