There is a growing trend of employers using AI tools to monitor employees' productivity, enhance security and manage other matters such as remote working. The question that often arises is how far an employer can go before it crosses the line and infringes an employee's right to privacy.
The right to privacy is enshrined in the Constitution of South Africa. However, like all rights, it may be limited where it is reasonable and justifiable to do so.
This article briefly explores the tension between an employer's use of AI tools in the workplace and employee privacy.
Employers may introduce AI surveillance tools in the workplace, including but not limited to:
These tools can provide an employer with unique and very useful insights into its employees' productivity, but they often collect large amounts of sensitive personal information. In very limited circumstances, the monitoring may even extend to certain telephone calls made by an employee during work hours on the employer's premises, as seen in Protea Technology Ltd v Wainer [1997] (Protea Technology).
Section 14 of the Constitution guarantees the right to privacy, which has been interpreted to include the workplace. However, this right is not absolute and must be balanced against the employer’s legitimate business interests, such as productivity, data security and asset protection.
In Numsa v Rafee N.O. [2016] (LC), an employee captured photographs of a production process without permission and later refused to surrender his personal phone containing those photographs. The Labour Court held that the employee's right to privacy is not absolute and may be limited to uphold legitimate business security interests, particularly where confidential information or a breach of trust is involved.
Under PoPIA, and in particular section 71, which deals with automated decision-making that may have legal consequences for an employee, employers must comply with various conditions in order to process employee information lawfully. These include, but are not limited to:
AI tools that monitor employees must fall within the boundaries of PoPIA. Accordingly, using AI tools that covertly gather and process employees' information without their knowledge would breach PoPIA.
Employers should have an explicit policy that outlines the nature of the AI tools they intend to use; the purpose of the data collection; the reason for the monitoring; and a clause recording the employee's consent to such intrusions.
This could be clearly set out in workplace policies or employment contracts, with consent sought from the employee before such tools are implemented.
During and after the Covid-19 pandemic, many employers implemented hybrid or remote working models, and the use of digital AI surveillance has increased accordingly. Employers may use AI surveillance tools to monitor, inter alia, logins, meetings, mouse movement or the time employees spend on various applications.
However, excessive intrusion into an employee’s home environment, even during work hours, may constitute a violation of dignity and privacy unless clear policies and guidelines are in place.
In SA Transport & Allied Workers Union on behalf of Assegai v Autopax [2001] (BCA), the employer made use of a private investigator to video record the actions of an employee, and the arbitrator weighed the employee’s right to privacy against the employer’s right to protect its property and economic interests.
Based on the facts of this case, the arbitrator ruled in favour of the employer. The arbitrator held that “what he [the employee] said to a passenger who wanted to buy a ticket, cannot be regarded as confidential or private.”
In Afrox Ltd v Laka [1999] (LC), which dealt with the admission of video evidence that went to the heart of the employee's defence, Zondo J set aside the arbitration award on the basis that the arbitrator's refusal to admit the footage constituted a gross irregularity in the proceedings.
Accordingly, while surveillance of employees may be permissible in certain circumstances, a number of factors must be weighed, including, but not limited to, the reason for the surveillance and the employer's right to protect its property and economic interests.
Importantly, when an employer intends to rely on video footage or surveillance of an employee, the employer must ensure that such material is authentic. In Numsa obo Ngxande v PFG Building Glass (Pty) Ltd [2020], the employer failed to prove on a balance of probabilities that the video material it introduced was authentic, and the evidence was rejected.
The Commission for Conciliation, Mediation and Arbitration (CCMA) and Labour Courts may scrutinise how AI surveillance evidence is used in disciplinary proceedings. If AI surveillance is excessive or conducted secretly without justification, such evidence may be deemed inadmissible, and the process could be found to be procedurally unfair.
In order to remain compliant while using AI tools, an employer should:
While the use of AI surveillance tools in the workplace can be useful and is not inherently unlawful, it must be proportionately balanced against employees' rights.