Challenges of Using Risk-Based Approaches to Anchor Epistemic Data Justice for Rural Women in Tanzania and Kenya

Introduction

Increased access to affordable internet and smartphones, the reach of mobile broadband signals, and investment in infrastructure have contributed to digital transformation in Africa. However, gender-specific challenges, such as inequalities in access to technology and digital literacy, persist.1 The Digital Transformation Strategy for Africa 2020-2030 aims to ensure that digital transformation bridges the gender and rural-urban divides.2 As part of the strategy, there has been an increase in the adoption of technologies targeted at rural women in Africa. In the agricultural sector, for example, rural women are increasingly innovating with and using technologies such as mobile applications for contract farming, weather forecasting, and picture-based extension services. Programmes that have adopted these technologies include Jembe in Tanzania and Hello Tractor in Kenya. Through these technologies, information obtained from or created by rural women is kept in databases and further processed.3

However, there is a catch. Designers of technologies used by rural women in Africa often fail to consider the women’s ways of knowing.4 This oversight slows the much-desired integration of rural women’s indigenous knowledge into new technologies. It also undermines data protection risk assessment practices by data controllers who process personal data relating to rural women. Evidence shows that, across the technology lifecycle, this failure exposes rural women to threats of misuse of their data.5 These threats pose risks to women’s right to privacy and privacy-related rights and could result in other economic, social, physical or intangible harms.6 The first blog article in this series showed that these threats could result in ‘high’ or ‘significant’ risks of harm or other legal effects for rural women in their knowledge contexts. It also showed how denial of rural women’s knowledge contexts could result in risks of vulnerability, exclusion, and discrimination. The article therefore emphasized the need to consider and apply additional principles of epistemic data justice in data governance for rural women in Africa.

This blog article advances the discussion in the first article. It examines how the law envisages data controllers and processors using risk-based approaches to anchor epistemic data justice principles.7 It then focuses on practical compliance challenges that data controllers or processors may encounter when using risk-based approaches to address epistemic data injustice affecting rural women. The analysis draws on examples of risk assessment regimes such as personal data breach management and impact assessment. The article concludes that legal challenges related to definitions, legislative text, and practice could undermine the potential of risk-based approaches, and it proposes a pathway for overcoming the identified challenges.

How the Law Positions Risk-Based Approaches to Anchor Epistemic Data Justice

African regional data protection instruments position risk-based approaches to address epistemic data injustices arising from the processing of personal information.8 The Malabo Convention 2014, the Personal Data Protection Guidelines for Africa 2018, and the Digital Transformation Strategy for Africa 2020-2030 provide the bases for these approaches. These instruments require data controllers and processors to identify, assess, and mitigate epistemic data injustices.

Risk-based approaches also permeate the legal instruments applicable in Tanzania and Kenya at the domestic level. In Tanzania, section 2 of the Personal Data Protection Act adopts a risk-based approach to categorizing sensitive personal data: data are categorized according to whether their processing presents a major risk to the rights and interests of the data subject. Section 27(2)(b) of the Act provides that risks to data subjects’ rights are a key factor guiding the appropriateness of security safeguards. Under section 37(3), the risk posed to data subjects is considered when assessing compensation due in case of a violation of the law. Further, Regulations 27 and 33(1) of the Personal Data Protection (Personal Data Collection and Processing) Regulations 2023 require security safeguards and data protection impact assessments (DPIAs) to be implemented through a risk-based approach.

In Kenya, section 47(2)(a) of the Data Protection Act 2019 provides that the risk of significant harm is a crucial factor influencing the categorization of sensitive personal data. Section 8(1)(j) of the Act mandates the Office of the Data Protection Commissioner (ODPC) to ensure that developments in the processing of personal data do not result in significant privacy risks.9 Under section 19(2)(e), every application for registration of data controllers and data processors must describe the risks and the safeguard measures adopted. Further, an impact assessment under section 31 is mandatory only where a processing operation poses a high risk to the rights and freedoms of data subjects. Sections 41 and 42 of the Act envisage that establishing and maintaining appropriate safeguards against identified risks is vital to data protection by design and by default. Lastly, section 43 of the Act provides that only a security incident that presents a real risk of harm to the data subject is considered a personal data breach.

Possible Legal Compliance Challenges Associated with the Positioning

Ideally, the framework of a risk-based approach to data protection10 should eliminate the risk of epistemic data injustice against rural women throughout the technology lifecycle. Realizing this goal is only possible if the legal design of risk assessment regimes has the full capacity to identify, assess, prioritize, and eliminate the risks of epistemic data injustice that rural women could encounter. However, the capacity of risk-based approaches to realize this potential is limited for the following reasons:

  1. Restricted definition of ‘personal data’

The Malabo Convention 2014 and the data protection laws in Tanzania and Kenya define personal data as any information relating to an identified or identifiable person. The definition has limitations when applied to the contexts of rural women, which usually favour group privacy. For example, smallholder rural women farmers who use contract farming applications often own and create information in groups rather than as individuals.11 In such cases, the definition of personal data may not cover women’s group knowledge of crop production and farm inputs. Dagne has noted that the impossibility of extending the definition to cover farm data12 is itself an injustice. In effect, the limited scope of the definition may be counter-productive in redressing epistemic data injustices for rural women.

  2. No provision for involving rural women in the risk assessment process

Generally, access and literacy challenges mean that women participate in digital innovation at low levels.13 Participation is even lower for rural women, who face additional structural barriers such as patterns of gender exclusion in school enrolment and in access to cellular networks and other digital infrastructure.14 The potential of the risk assessment regimes only goes so far, since they neither mandate the proactive involvement of rural women nor capture their perspectives on epistemic data injustices. This gap leaves data controllers and processors with wide discretion in how they satisfy their compliance obligations.

On the DPIA obligation, neither section 31 of the Kenyan Data Protection Act 2019, read with the Kenyan Data Protection (General) Regulations 2021, nor Regulations 33 and 34 of the Tanzanian Personal Data Protection (Personal Data Collection and Processing) Regulations 2023 requires data controllers or data processors to involve rural women in the assessment process. The apparent limitation in the Kenyan approach could be remedied by the requirement that data controllers consider the context of the processing operations;15 in this way, the rich socio-cultural knowledge of rural women could form part of that context. The Tanzanian Regulations, however, do not provide such a remedy: they do not require data controllers to consider the context of processing when conducting, or determining whether to conduct, a DPIA.16

  3. Possible biased positionality of assessors

Assessors are in-house or outsourced individuals or entities that conduct risk assessment or management procedures.17 Practice shows that multi-disciplinary teams of assessors are usually drawn from information and communication technology (ICT) and related professions.18 Urban-based male professionals usually dominate these professions.19 In Tanzania, for example, women account for only 25% of teams working in ICT, while the figure for Kenya is slightly higher at 30%. These male-dominated assessment teams are tasked with assessing, for example, the real risk of harm to rural women in case of personal data breaches. Assessors can therefore meet the legally mandated standard for risk assessment even when they prioritize technical expertise over women’s perspectives and knowledge contexts. In some cases, male dominance may also come with underlying gender biases that increase the chances of rural women’s knowledge being ignored in the risk assessment process.

  4. Inadequacies in risk assessment tools

Designers of digital technologies could use templates and best practices to become aware of the unique epistemic risks, vulnerabilities, and nuanced issues affecting rural women. However, the forms and templates used in Tanzania and Kenya do little to open designers of new technologies up to rural women’s ways of knowing and perceptions of risk. For example, the DPIA templates in Form 9 of the First Schedule to the Tanzanian Personal Data Protection (Personal Data Collection and Processing) Regulations and in the annexure to the Kenyan Guidance Note on DPIA use generic terms. A data controller can fill out and submit a DPIA report to regulators even when the risks posed to rural women in their knowledge contexts are neither considered nor reported, as the sketch below illustrates.
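By way of illustration only, the minimal sketch below contrasts the kind of generic fields such templates contain with additional prompts that would force knowledge-context risks onto the record. The structure and field names are assumptions made for this example; they are not drawn from Form 9 or from the Kenyan Guidance Note on DPIA.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: the field names below are assumptions made
# for this sketch, not the actual contents of Form 9 (Tanzania) or the Kenyan
# Guidance Note on DPIA.


@dataclass
class GenericDpiaTemplate:
    """Generic fields of the kind commonly found in DPIA templates."""
    processing_purpose: str = ""
    categories_of_data: list[str] = field(default_factory=list)
    security_measures: list[str] = field(default_factory=list)
    overall_risk_level: str = ""  # e.g. "low", "medium", "high"


@dataclass
class ContextAwareDpiaTemplate(GenericDpiaTemplate):
    """Additional prompts that would surface knowledge-context risks."""
    group_or_communal_data: bool = False         # is the knowledge created or held in groups?
    indigenous_knowledge_involved: bool = False  # does processing draw on local ways of knowing?
    data_subjects_consulted: bool = False        # were the affected rural women consulted?
    consultation_summary: str = ""               # what risks did they themselves identify?
    knowledge_context_risks: list[str] = field(default_factory=list)

    def ready_to_submit(self) -> bool:
        # A context-aware form could refuse submission until the
        # knowledge-context prompts have actually been answered.
        return self.data_subjects_consulted and bool(self.knowledge_context_risks)
```

Under an (assumed) design like this, a DPIA report could not be treated as ready for submission while the knowledge-context prompts remained unanswered, which is precisely the check that the current generic forms do not impose.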

  5. Structural challenges to access to justice

Data protection law in Kenya and Tanzania offers complaint mechanisms for persons aggrieved by non-compliance with the prescribed risk assessment regimes.20 The rules on standing and the practices on access shape how rural women experience and use these mechanisms to remedy epistemic data injustices. Regarding standing, the Tanzanian Data Protection Regulations take a more forward-thinking approach by allowing interested persons to lodge complaints on behalf of rural women.21 Although the approach also applies in Kenya under Regulation 4(3)(b) of the Complaints Handling Regulations, the ODPC has interpreted it somewhat restrictively: at present, only the affected rural women who qualify as data subjects, acting either directly or through a fiduciary,22 can complain to the regulator.23 Regarding access to the regulator’s offices, the Kenyan ODPC has operationalized its Strategic Plan by devolving its offices to key regions of the country. The Tanzanian Personal Data Protection Commission (PDPC), still in its nascent stages of operation, has yet to roll out comparable devolution plans, although it can be reached through various electronic platforms.

Implications of the Challenges

Legal risk assessment regimes in Kenya and Tanzania are good starting points for realizing the principles of the epistemic data justice framework for rural women in Africa discussed in the previous blog article. On their own, however, they cannot fully guarantee the desired epistemic justice goals. Ultimately, the buck stops with the States of Tanzania and Kenya, since they recognize, and undertake to give effect to, privacy and related rights under the African Charter on Human and Peoples’ Rights.24

Furthermore, the discussion of the legal compliance challenges has revealed an ‘irony of regulation’: laws enacted by States to tackle epistemic data injustices could themselves become, or be adapted into, instruments of injustice. Comparing the two States’ approaches also shows that stakeholders could address some legal compliance challenges through non-legal means, such as strategically prioritizing devolution and organizational interventions for gender equality in assessment teams. Implementing principles of epistemic data justice for rural women in Africa therefore requires implementers to look beyond the text of the risk-based approaches and the law on risk assessment regimes.

Conclusion

Kenya and Tanzania differ in how they deploy risk assessment regimes as tools for addressing epistemic data injustices. Overall, however, both domestic frameworks fall short in addressing the risks of epistemic data injustice posed to rural women. To overcome the identified challenges, the design, application, and enforcement of risk assessment regimes need to be retooled.

Image is from flickr.com

1 The Digital Transformation Strategy for Africa (2020-2030), p 16.

2 The Digital Transformation Strategy for Africa (2020-2030), p 4.

3 UN Women, ‘Technologies for Rural Women in Africa’ p 3. <https://africa.unwomen.org/sites/default/files/Field%20Office%20Africa/Attachments/Publications/2016/03/tech%20for%20rural%20women%20policy%20brief-web.pdf> accessed 16 April 2024.

4 Ibid, p 7.

5 Report of the Special Rapporteur on the Right to Privacy 2020, para 31(c) and (d); Jemimah Njuki et al, ‘A Qualitative Assessment of Gender and Irrigation Technology in Kenya and Tanzania’ (2014) 18(3) Gender, Technology and Development 303, 319.

6 GDPR, Recital 75 <https://gdpr-info.eu/recitals/no-75/>; Data Protection Commission (Ireland), ‘Risk-Based Approach’ <https://www.dataprotection.ie/en/organisations/know-your-obligations/risk-based-approach?> accessed 11 April 2024.

7 Christopher Kuner et al., ‘Risk Management in Data Protection’ (2015) 5(2) International Data Privacy Law, 95, 96.

8 Malabo Convention 2014, Arts 20, 21; The Digital Transformation Strategy for Africa (2020-2030), pp 2, 9.

9 Data Protection Act 2019, s 8(1)(j).

11 Godfrey Massay, ‘The Struggles for Land Rights by Rural Women in Sub-Saharan Africa: The Case of Tanzania’ (2020) 11(2) African Journal of Economic and Management Studies 271.

12 Tesh Dagne, ‘Embracing the Data Revolution for Development: A Data Justice Framework for Farm Data in the Context of African Indigenous Farmers’ (2021) 20 Journal of Law, Social Justice and Global Development 33.

13 Digital Transformation Strategy for Africa (2020-2030), pp 4 and 15.

14 These barriers were discussed in the previous blog article.

15 Data Protection Act 2019, s 31(1).

16 Tanzanian Personal Data Protection (Personal Data Collection and Processing) Regulations 2023, reg 33.

17 Darius Kloza et al., ‘Towards a Method for Data Protection Impact Assessment: Making Sense of GDPR Requirements’ p 3 < https://cris.vub.be/ws/portalfiles/portal/48091346/dpialab_pb2019_1_final.pdf> accessed 16 April 2024.

18 See the composition of some assessors in Kenya at <https://www.romer.co.ke/#whoweare>; <https://sentinelafricaconsulting.com/team/> accessed 27 April 2024.

19 Doreen Kinja, ‘Organisational Culture Change Needed to Address Gender Bias in Africa’s ICT Space’ (August 2022) <https://www.africa.com/organisational-culture-change-needed-to-address-gender-bias-in-africas-ict-space/> accessed 16 April 2024; S Vyas-Doorgapersad, ‘Gender and Information and Communication Technology (ICT) in Southern Africa to Promote the Sustainable Development Goals (SDGs)’ (2018) 26(2) Administratio Publica 7, 11.

20 Data Protection Act 2019, Part VIII; The Personal Data Protection (Personal Data Collection and Processing) Regulations 2023, reg 36.

21 Tanzanian Personal Data Protection (Personal Data Collection and Processing) Regulations 2023, reg 36.

22 Gichuhi & 2 Others v Data Protection Commissioner; Mathenge & Another (Interested Parties) (Judicial Review E028 of 2023) [2023] KEHC 17321 (KLR), para 36.

23 Data Protection (General) Regulations 2021, reg 56(1).

24 African Charter on Human and Peoples’ Rights 1981, Art 1.
