Faculty of Commerce, Law and Management (ETDs)
Permanent URI for this community: https://hdl.handle.net/10539/37778
Search Results
2 results
Item: Algorithmic pricing and its implications on competition law and policy in South Africa (University of the Witwatersrand, Johannesburg, 2023) Fowler, Ashly

The upsurge in the use of technology has proliferated the use of pricing algorithms, which have become essential to e-commerce. Although South Africa was privy to this shift before 2020, the onset of the Covid-19 pandemic accelerated it. While the use of pricing algorithms brings many pro-competitive benefits, it is also accompanied by various anti-competitive effects, including algorithmic collusion. Although this topic has been addressed within the context of competition law in other jurisdictions, it has yet to be addressed from the viewpoint of the South African Competition Act 58 of 1998. Accordingly, the aim of this paper is to establish whether the Competition Act, and South African competition policy at large, is robust enough to withstand the effects of digitalisation, particularly from the perspective of section 4 of the Competition Act, which regulates relationships between competitors. In carrying out this analysis, this paper defines pricing algorithms and outlines their pro-competitive and anti-competitive effects. Thereafter, through the prism of four scenarios in which pricing algorithms facilitate collusion, as posited by Ezrachi and Stucke in their seminal work Virtual Competition, this paper establishes the robustness of the Competition Act by applying the scenarios to the Act. Ultimately, this paper concludes that the current Competition Act (as amended) is in fact robust enough to tackle situations where algorithmic collusion arises.
Where it is not, this paper argues that it is, at present, unnecessary for the relevant authorities to amend the current law or introduce any new laws.

Item: Bias in data used to train sales-based decision-making algorithms in a South African retail bank (2021) Wong, Alice

Banks are increasingly using algorithms to drive informed and automated decision-making. Because algorithms rely on training data to learn the correct outcome, banks must ensure that customer data is used securely and fairly when creating product offerings, as there is a risk of perpetuating intentional and unintentional bias. This bias can result from unrepresentative and incomplete training data, or from data that is inherently biased due to past social inequalities. This study aimed to understand the potential bias in the training data used to train the sales-based decision-making algorithms that South African retail banks use to create customer product offerings. The research adopted a qualitative approach and was conducted through ten virtual one-on-one interviews with semi-structured questions. Purposive sampling was used to select banking professionals from data science teams in a particular South African retail bank, across demographics and levels of seniority. The data collected from the participants in the interviews were then thematically analysed to draw conclusions from the findings. Key findings included: an inconsistent understanding across data science teams in a South African retail bank around the prohibition on using the gender variable, which could result in certain developers using proxy variables for gender to inform certain product offerings; and a potential gap regarding the possible use of proxy variables for disability (due to the non-collection of this demographic attribute) to inform certain product offerings.
Although disability was not identified as a known biased variable, it did raise the question of whether banks should be collecting customers' disability data and doing more, in terms of social responsibility, to address social inequalities and enable disabled individuals to contribute as effectively as able-bodied individuals. As algorithms tend to generalise based on the majority's requirements, they produce a higher error rate for underrepresented or minority groups. This could result in financial exclusion or incorrect products being offered to certain groups of customers which, if not corrected, would lead to the continued subordination of those customers based on demographic attributes.