The Information and Communication Technologies Authority (ICTA) is the national regulatory body of Mauritius for matters relating to information and communication technologies. The ICTA also implements government policy relating to the information and communication industry. Another of its functions is to take steps to regulate or curtail harmful and illegal content on the Internet and other information and communication services. The ICTA published a consultation paper on proposed amendments to the ICT Act for regulating the use, and addressing the abuse and misuse, of social media in Mauritius.
The Consultation Paper provided some background information about measures on illegal content which were taken in Germany, the United Kingdom, France, the European Union, Australia and India.
In 2017, the German Bundestag adopted an Act to "Improve Enforcement of the Law in Social Networks" (Network Enforcement Act, NetzDG). The Act is applicable to "telemedia service providers which, for profit-making purposes, operate internet platforms which are designed to enable users to share any content with other users or to make such content available to the public (social networks)." The Act specifies that platforms offering journalistic or editorial content are not considered social networks, given that the service providers are responsible for the content on those platforms. NetzDG imposes reporting obligations on social networks receiving more than 100 complaints per year about unlawful content, and requires social networks to maintain an effective and transparent procedure for handling complaints about unlawful content.
A research report by William Echikson and Olivia Knodt of the Counter Extremism Project concluded that it remains "uncertain whether NetzDG has achieved significant results in reaching its stated goal of preventing hate speech". A fundamental question in the debate about NetzDG was: "Who should be held accountable for content once it is deemed illegal by national law?"
The government of the United Kingdom issued an Online Harms White Paper for consultation. As an outcome of the consultation, the government will propose legislation which defines what harmful content is in scope. The legislation will also aim at holding "tech giants to account for the way they address this content on their platforms". The aim is also to keep regulation proportionate: fewer than 3% of UK businesses will be in scope. In addition, content published by newspapers on their own sites will be outside the scope of the regulatory framework. The proposed enforcement action is to issue fines for non-compliance.
In France, a law was enacted to fight hateful content on the Internet. The law specifies a new regime for platform operators with high traffic. It requires online platforms to remove content related to terrorism and child pornography, as well as hateful content which is «manifestement illicite» (manifestly illegal). The Conseil constitutionnel ruled that some parts of the law were a disproportionate infringement of freedom of expression.
The European Union adopted a regulation on addressing the dissemination of terrorist content online. The rules are to address the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market. The regulation states that the rules should fully respect fundamental rights, including the right to respect for private life, to the protection of personal data, to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and to an effective remedy, which are protected in the European Union.
The Consultation Paper argues that the above-mentioned regulations cannot be implemented or would be ineffective in Mauritius as online platforms do not have a physical presence in the country. It then concludes that technical enforcement measures would be required to monitor incidents which are currently reported through the Mauritian Cybercrime Online Reporting System. The ICTA also highlighted that there is already a technical toolset which has been deployed to block material which is an offence under the Child Protection Act (Mauritius).
There is a Mauritian Cybercrime Online Reporting System for incident reporting. The published guidance is to report the following types of incidents through the system: sextortion, identity theft, online scams, fraud and phishing. The guidance is to report the following types of incidents directly to the service (or social media company): online harassment, hacking, offensive and illegal content, cyberbullying and cyberstalking. The statistics in the Consultation Paper could therefore underestimate the number of incidents, assuming that the common person follows the published guidance.
The Regulatory and Operational Framework proposes setting up a National Digital Ethics Committee to assess harmful and illegal online content, together with a technical toolset to enforce its decisions.
The Consultation Paper concludes that the framework will undoubtedly interfere with fundamental rights, such as the rights to privacy and confidentiality and freedom of expression.
Section 8 of the Consultation Paper proposes that the National Digital Ethics Committee be given the same powers as a public operator to intercept or withhold messages which it has reason to believe are indecent or abusive, in contravention of the law, or of a nature likely to endanger or compromise national security, public safety or public order. Furthermore, the National Digital Ethics Committee will be given powers to investigate illegal and harmful content and to decide whether, in its opinion, the online content is illegal and harmful. The Section also states that the National Digital Ethics Committee will have to follow a modus operandi which is compliant with the Data Protection Act 2017.
The Consultation Paper states in Section 11.2.1 that incoming and outgoing Internet traffic in Mauritius will have to be segregated, with social media traffic routed to a proxy server. It is also stated that all social media traffic will be decrypted. The capabilities sought are, in simple terms, to store all social media content, to block specific content, and to identify the IP address from which the content originated. The proposed technology requires service providers carrying traffic into and out of Mauritius to forward all social media-related traffic through the "technical toolset" provider. Internet users in Mauritius will have to install a government-issued digital certificate on their computers and mobile phones to be able to access social media content.
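The diversion of social media traffic to a proxy relies on longest-prefix matching, the rule routers use to choose among overlapping routes: a more specific prefix always wins over a less specific one. The following sketch illustrates that behaviour; the prefixes are reserved documentation ranges (203.0.113.0/24 standing in for a social network's address block, 198.51.100.1 for the hypothetical proxy), not real addresses from the Consultation Paper.

```python
import ipaddress

# Hypothetical routing table for illustration. 203.0.113.0/24 stands in for
# a social media network's address block; the next hops are plain labels.
routing_table = {
    ipaddress.ip_network("0.0.0.0/0"): "upstream transit",       # default route
    ipaddress.ip_network("203.0.113.0/24"): "upstream transit",  # route learned via BGP
}

def lookup(dest: str) -> str:
    """Longest-prefix match, as a router would perform it."""
    addr = ipaddress.ip_address(dest)
    matches = [net for net in routing_table if addr in net]
    return routing_table[max(matches, key=lambda n: n.prefixlen)]

# Traffic to the social network initially follows the normal transit path.
assert lookup("203.0.113.10") == "upstream transit"

# Injecting two more-specific /25 routes covering the same block overrides
# the /24 and silently diverts all of its traffic to the interception proxy.
routing_table[ipaddress.ip_network("203.0.113.0/25")] = "proxy 198.51.100.1"
routing_table[ipaddress.ip_network("203.0.113.128/25")] = "proxy 198.51.100.1"

assert lookup("203.0.113.10") == "proxy 198.51.100.1"
```

The same longest-prefix rule is what makes the scheme fragile: any party able to inject a still more specific prefix can divert the traffic again.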
Offences related to social media are handled, in Mauritius, as a breach of the Information and Communication Technologies Act because of the definition of «information and communication technologies» in Section 2 of the Act. There is case law which points to the element of causing annoyance as constituting an offence under Section 46 of the Act. In some cases the court relied on the dictionary definition of the word «annoyance» because the Act does not define the word. In other words, it is left to the court to interpret the meaning of the word. The lack of a definition causes uncertainty and puts a common person using social media at a significant risk given that it is difficult for the person to know what is permissible.
The Kenyan High Court struck down a similar provision in the Kenya Information and Communications Act on the grounds that individuals do not know the parameters within which their communication falls, and the provision therefore offends against the rule requiring certainty in legislation that creates criminal offences.
The penalty for committing an offence under the Kenyan law is a fine of up to USD 500, a term of imprisonment not exceeding three months, or both. The penalty in Mauritius is, in comparison, a fine of up to fifty times that amount and a term of penal servitude up to forty times longer, that is, up to ten years.
It is not clear from the Consultation Paper whether Sections 46(ga) and 46(ha) of the Information and Communication Technologies Act, which were introduced in 2018, are used mainly because they do not require digital forensic evidence to charge an offender.
Facebook commissioned a human rights impact assessment of the company's presence in Myanmar. One of the issues identified in 2018 was that legal provisions frequently use vague and inconsistent terms, and these are applied broadly to limit freedom of expression. One of the disadvantages of the company having a local presence was that it may increase government leverage over content restrictions and data requests, by allowing the government to threaten seizure of Facebook's IT equipment or data, or to place Facebook staff at safety risk.
There was no government request from Mauritius, through a legal process, for Facebook user information over the past two years. The United States, the jurisdiction in which Facebook is incorporated, describes the Mauritian judiciary as independent and the domestic legal system as generally non-discriminatory and transparent. The United States and Mauritius have had a bilateral extradition treaty since 1935; there are also other agreements between the two countries.
The Republic of Kazakhstan enacted a law in 2019 requiring Internet Service Providers (ISPs) to provide a digital certificate to their users to protect the users' personal data. In response, Google blocked the digital certificate in Chrome, a well-known web browser. In 2020, Mozilla blocked the digital certificate of the Information Security Certification Authority CA, which was issued by the government of Kazakhstan.
It could be said that Facebook is the face of the Internet in Mauritius as over two-thirds of the citizens of Mauritius are on that social media network. Anecdotal evidence shows that Facebook is used as a medium to express political opinions and by businesses to market their products and services. Facebook is also the premier dating site used by young adults in Mauritius.
It is assumed that young adults will share personal data during their romantic endeavours. The proposed technical toolset will obviously collect and store a significant amount of personal data. The modus operandi is disproportionate from a data minimization perspective. It is doubtful whether the Data Protection Act 2017 (Mauritius) provides adequate protection in this case given the exception in Section 23(3)(b)(ii) of the Act. The protection for privacy of home and other property is enshrined in Article 9 of the Constitution of Mauritius. However, Article 9 of the Constitution enshrines a segmented right to privacy and may not be applicable.
The technical information available in the Consultation Paper was used to develop a proof of concept. The proof of concept did not include changes to the routing policy of social media networks or other networks; the assumption was that this is done by injecting routes into the BGP (Border Gateway Protocol) tables of Internet service providers in Mauritius. The tests carried out point to a negative impact on the confidentiality of communication, as it was possible for a third party to decrypt login credentials. The tests also showed that there was a risk to the integrity of the data exchanged over the communication channel.
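The decryption observed in the proof of concept is possible because a proxy holding a trusted certificate re-signs the sites it intercepts. It is also why browsers were able to block the Kazakh certificate: a substituted certificate has a different fingerprint than the genuine one, and clients that pin or blocklist fingerprints can detect the swap. The sketch below illustrates the comparison; the certificate byte strings are placeholders, since real code would take the DER bytes from the TLS handshake (for example via `ssl.SSLSocket.getpeercert(binary_form=True)`).

```python
import hashlib

# Placeholder byte strings standing in for DER-encoded certificates.
genuine_cert = b"-----GENUINE SOCIAL NETWORK CERTIFICATE-----"
intercept_cert = b"-----LEAF RE-SIGNED BY INTERCEPTION CA-----"

def fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a certificate, the form used in pin lists."""
    return hashlib.sha256(der_bytes).hexdigest()

# A pin recorded when the site was first reached over a trusted path.
pinned = fingerprint(genuine_cert)

def connection_is_intercepted(presented: bytes) -> bool:
    # If the presented certificate's fingerprint differs from the pin,
    # some party between client and server substituted the certificate.
    return fingerprint(presented) != pinned

assert not connection_is_intercepted(genuine_cert)
assert connection_is_intercepted(intercept_cert)
```

Nothing in the interception proxy can forge a matching fingerprint without breaking SHA-256, which is why the countermeasure is effective even when the interception CA is installed in the operating system's trust store.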
The Consultation Paper does not provide much insight into online content which is harmful or illegal. It is suggested to include anonymized information about injured parties to increase awareness about the consequences of harmful or illegal content. The information will also be useful for research on social media misuse in Mauritius.
It can be surmised from the Consultation Paper that the stumbling block is the identification of the online offender, because social media companies do not have a presence in Mauritius. The Consultation Paper unfortunately does not include any explanation for the absence of legal requests in regard to social media offences.
It is unlikely that self-regulation is working, or else online content which may be in breach of local law would have been flagged as violating the community standards of the online platform. However, there isn't any explanation in the Consultation Paper of why self-regulation is not working, except for a mention that abusive online content is posted in the native language.
Some of the content on social networks could be unpleasant or, at worst, harmful. That content could be written or spoken in a language which is not understood by the review teams of social media companies. Some of it is probably in violation of the community standards of social media companies. It is recommended that social media companies publish metrics on the violation of their community standards by country so that the effectiveness of their enforcement mechanisms can be assessed at a country level. Social media companies should be transparent about whether the content being reviewed is in a language which they do not understand.
The proposed privacy infringement on a large-scale is based on the argument that social media companies do not have a presence in Mauritius. The risk of privacy violations is significant. The ICTA stated in the Consultation paper that "a handful of giant monopolies are bent on collecting users’ personal data". The proposed modus operandi adopts a similar approach without including a remedy in case of a privacy violation.
The National Digital Ethics Committee is defined both as an investigative body and an adjudicating body. It is suggested that adjudication be within the remit of the judiciary to ensure public confidence.
None of the other countries' laws about illegal content on social media required overriding the routing policy of social network companies. Regulating routing policy at a national level is a double-edged sword, as it opens the door for other countries to reciprocate. Such regulation is a risk from a cyber security perspective. It is recommended to leave it to service providers to decide on their routing policies.
It is highly improbable that the technical enforcement proposed in the Consultation Paper will be successful given that web browsers have a history of blocking digital certificates issued for technology-based regulation purposes.
It is not possible to address a societal issue solely with a technological solution. Mauritius could consider encouraging local businesses and non-governmental organizations to cooperate on finding solutions to some of the types of incidents which it identified.
1. "Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius", Information and Communication Technologies Authority, April 2021
2. "Act to Improve Enforcement of the Law in Social Networks", Bundestag, July 2017
3. "Germany’s NetzDG: A key test for combatting online hate", William Echikson and Olivia Knodt, November 2018
4. Consultation outcome - Online Harms White Paper: Full government response to the consultation, Secretary of State for Digital, Culture, Media and Sport and the Secretary of State for the Home Department, December 2020
5. Loi no 2020-766 du 24 juin 2020 visant à lutter contre les contenus haineux sur internet (law of 24 June 2020 to fight hateful content on the Internet), Légifrance, June 2020
6. Avis sur la proposition de loi visant à lutter contre la haine sur Internet (opinion on the bill to fight hate on the Internet), Conseil d'État, May 2019
7. Regulation (EU) 2021 of the European Parliament and of the Council on addressing the dissemination of terrorist content online, Council of the European Union, March 2021
8. ICTA and IWF sign MoU and launch portal to protect children from illegal online content, Information and Communication Technologies Authority, October 2013
9. In the High Court of Kenya at Nairobi, Milimani Law Courts, Constitutional and Human Rights Division, Petition No 149 of 2015, Republic of Kenya, April 2015
10. Human Rights Impact Assessment: Facebook in Myanmar, BSR, 2018
11. Transparency, Facebook, Accessed in May 2021
12. 2020 Investment Climate Statements: Mauritius, United States Department of State, Accessed in May 2021
13. What is Qaznet certificate and is it secure, factcheck.kz, July 2019
14. Google Git changes, Google, August 2019
15. MITM in Kazakhstan, Mozilla, December 2020
16. Does the Mauritian Constitution protect the right to privacy? An insight from Madhewoo v The State of Mauritius, African Human Rights Law Journal, 2018