ETD Collection
Permanent URI for this collection: https://wiredspace.wits.ac.za/handle/10539/104
Please note: Digitised content is made available at the best possible quality, taking into consideration file size and the condition of the original item. These factors may sometimes affect the quality of the final published item. For queries regarding the content of the ETD collection, please contact the IR specialists by email or on Tel: 011 717 4652 / 1954.
Search Results (5 items)
Item: Augmentative topology agents for open-ended learning (2024). Nasir, Muhammad Umair.
We tackle the problem of open-ended learning by improving a method that simultaneously evolves agents and increasingly challenging environments. Unlike previous open-ended approaches that optimize agents using a fixed neural network topology, we hypothesize that open-endedness and generalization can be improved by allowing agents' controllers to become more complex as they encounter more difficult environments. Our method, Augmentative Topology EPOET (ATEP), extends the Enhanced Paired Open-Ended Trailblazer (EPOET) algorithm by allowing agents to evolve their own neural network structures over time, adding complexity and capacity as necessary. Empirical results demonstrate that ATEP produces open-ended and general agents capable of solving more environments than a fixed-topology baseline. We also investigate mechanisms for transferring agents between environments and find that a species-based approach further improves the open-endedness and generalization of agents.

Item: Modelling implied volatility in South African stock options: a comparison of statistical and machine learning methods (2024). Harmse, Marike.
The Black-Scholes model is used to derive the price of an option. Two of its underlying assumptions are constant volatility and normally distributed stock price returns. Volatility is often modelled using various statistical and machine learning methods. This research focused on the use of GARCH, EGARCH, APARCH and LSTM models to predict the volatility underlying the All Share Index (ALSI), a South African stock index. Exploratory data analysis indicated that the ALSI exhibits the leverage effect, long-memory properties and asymmetry in its returns, and that traditional models such as ARCH and GARCH may not be sufficient to model stock price volatility. These models were therefore used as benchmarks against the APARCH, EGARCH and LSTM models.
The ARMA(3,3)-EGARCH(1,1) model outperformed the other models considered. While LSTM models can add value in the estimation of stock price volatility, they are highly sensitive to the selected hyperparameters and architecture.

Item: Country risk analysis: an application of logistic regression and neural networks (2017). Ncube, Gugulethu.
Country risk evaluation is a crucial exercise when determining the ability of countries to repay their debts. The global environment is volatile and is filled with macro-economic, financial and political factors that may affect a country's commercial environment, resulting in its inability to service its debt. This research report compares the ability of conventional neural network models and traditional panel logistic regression models in assessing country risk. The models are developed using a set of economic, financial and political risk factors obtained from the World Bank for the years 1996 to 2013 for 214 economies. These variables are used to assess the debt servicing capacity of the economies, as this has a direct impact on the return on investments for financial institutions, investors, policy makers and researchers. The models developed may act as early warning systems to reduce exposure to country risk.
Keywords: Country risk, Debt rescheduling, Panel logit model, Neural network models

Item: Dynamic neural network-based feedback linearization of electrohydraulic suspension systems (2014). Dangor, Muhammed.
Resolving the trade-offs between suspension travel, ride comfort, road holding, vehicle handling and power consumption is the primary challenge in designing active vehicle suspension systems (AVSS). Controller tuning with global optimization techniques is proposed to realise the best compromise between these conflicting criteria. The optimization methods adopted include Controlled Random Search (CRS), Differential Evolution (DE), Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Pattern Search (PS).
Quarter-car and full-car nonlinear AVSS models that incorporate electrohydraulic actuator dynamics are designed. Two control schemes are proposed for this investigation. The first is conventional Proportional-Integral-Derivative (PID) control, applied in a multi-loop architecture to stabilise the actuator and manipulate the primary control variables. Global optimization-based tuning achieved enhanced responses in each aspect of PID-based AVSS performance and a better resolution of the conflicting criteria, with DE performing best. The full-car PID-based AVSS was analysed for DE as well as for modified variants of PSO and CRS. These modified methods surpassed their predecessors with a better performance index; this was anticipated, as they were augmented to permit efficient exploration of the search space with enhanced flexibility in the algorithms. However, DE still maintained the best outcome in this respect. The second method is indirect adaptive dynamic neural network-based feedback linearization (DNNFBL), in which neural networks were trained with optimization algorithms and feedback linearization control was then applied. PSO generated the most desirable results, followed by DE. The remaining approaches exhibited significantly weaker results for this control method. These outcomes were attributed to the superior search characteristics of the DE and PSO algorithms, as well as to the nature of the problem, which now had more variables. The adaptive nature and ability to cancel system nonlinearities saw the full-car PSO-based DNNFBL controller outperform its PID counterpart.
It achieved a better resolution of the performance criteria, minimal chatter, superior parameter sensitivity, and improved suspension travel, roll acceleration and control force responses.

Item: Computational intelligence techniques for missing data imputation (2008). Nelwamondo, Fulufhelo Vincent.
Despite considerable advances in missing data imputation techniques over the last three decades, the problem of missing data remains largely unsolved. Many techniques have emerged in the literature as candidate solutions, including Expectation Maximisation (EM) and the combination of autoassociative neural networks and genetic algorithms (NN-GA). The merits of both techniques have been discussed at length in the literature, but they have never been compared to each other. This thesis contributes to knowledge by, firstly, conducting a comparative study of these two techniques. The significance of the difference in performance of the methods is presented. Secondly, predictive analysis methods suitable for the missing data problem are presented. The predictive analysis in this problem is aimed at determining whether the data in question are predictable and hence helps in choosing the estimation techniques accordingly. Thirdly, a novel treatment of missing data for online condition monitoring problems is presented. An ensemble of three autoencoders together with hybrid Genetic Algorithms (GA) and Fast Simulated Annealing (FSA) was used to approximate missing data. Several significant insights were deduced from the simulation results. It was deduced that, for the problem of missing data using computational intelligence approaches, the choice of optimisation method plays a significant role in prediction. Although it was observed that hybrid GA and FSA can converge to the same search space and to almost the same values, they differ significantly in duration.
This unique contribution has demonstrated that particular attention has to be paid to the choice of optimisation techniques and their decision boundaries. Another unique contribution of this work was not only to demonstrate that dynamic programming is applicable to the problem of missing data, but also to show that it is efficient in addressing it. An NN-GA model was built to impute missing data using the principle of dynamic programming. This approach makes it possible to modularise the problem of missing data for maximum efficiency. With the advancements in parallel computing, the various modules of the problem could be solved by different processors working together in parallel. Furthermore, a method is proposed for imputing missing data in non-stationary time series that learns incrementally even when there is concept drift. This method works by measuring heteroskedasticity to detect concept drift, and it explores an online learning technique. The introduction of this novel method opens new directions for research in which missing data can be estimated for non-stationary applications; many other methods still need to be developed so that they can be compared to the approach proposed in this thesis. Another novel technique for dealing with missing data in the online condition monitoring problem was also presented and studied. The problem of classifying in the presence of missing data was addressed, where no attempt is made to recover the missing values. The problem domain was then extended to regression. The proposed technique performs better than the NN-GA approach, in both accuracy and time efficiency during testing. The advantage of the proposed technique is that it eliminates the need for finding the best estimate of the data and hence saves time.
Lastly, instead of using complicated techniques to estimate missing values, an imputation approach based on rough sets is explored. Empirical results obtained using both real and synthetic data are given, and they provide valuable and promising insight into the problem of missing data. The work has confirmed that rough sets can be reliable for missing data estimation in large, real-world databases.
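The NN-GA idea summarised in the last abstract, in which a genetic algorithm evolves candidate values for missing entries until a learned model of the data is satisfied, can be illustrated with a minimal sketch. This is not the thesis code: the trained autoassociative network is replaced here by a known linear relation (y = 2a + b) standing in for the learned model, and all names and parameters are illustrative.

```python
import random

random.seed(0)

def reconstruction_error(record):
    """Error of a record against the stand-in model y = 2a + b."""
    a, b, y = record
    return (y - (2 * a + b)) ** 2

def impute_missing_y(a, b, generations=60, pop_size=30):
    """Evolve candidate values for a missing y until the record fits the model."""
    population = [random.uniform(0.0, 20.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Rank candidates by how well the completed record fits the model.
        population.sort(key=lambda y: reconstruction_error((a, b, y)))
        parents = population[: pop_size // 2]        # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = (p1 + p2) / 2.0                  # arithmetic crossover
            child += random.gauss(0.0, 0.5)          # Gaussian mutation
            children.append(child)
        population = parents + children              # elitist replacement
    return min(population, key=lambda y: reconstruction_error((a, b, y)))

estimate = impute_missing_y(a=3, b=4)  # true value under the model: 2*3 + 4 = 10
print(estimate)
```

In the thesis's actual setting, `reconstruction_error` would instead pass the completed record through a trained autoassociative neural network and measure the distance between input and output, and the GA (or FSA) would search over all missing entries of the record at once.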