Ensuring a Beneficial and Safe Digital Environment for Groups with Special Requirements

This issue pertains to the unique challenges that arise as children, senior citizens, persons with disabilities, and other vulnerable groups increasingly rely on cloud services (specifically software as a service, or SaaS) for important activities. For all of these groups, cloud services can ease access to essential services and other economic and social opportunities that may previously have depended on mobility, affluence, and other privileges (for example, distance learning, healthcare, and communication). However, as with other online services, reliance on cloud services can also expose these groups to serious risks (for example, the storage and distribution of child sexual abuse material, online scams, and other predatory behavior). To ensure these groups can fully benefit from the digital transformation, cloud services must be designed and governed with their unique requirements in mind.

Key Considerations

  • Current protections are insufficient. Current measures to protect vulnerable groups online inadequately address the scope of potential and realized harm as these groups increasingly turn to cloud-hosted platforms for critical services (such as telehealth and local government services).
  • Lagging attention to senior citizens, persons with disabilities, and other vulnerable groups online. Governments, international organizations, and industry have given less attention to ensuring the digital environment works for and protects senior citizens and persons with disabilities than to ensuring children’s digital rights, although they seem broadly supportive of doing so.
  • Trade-offs between privacy and security in protecting children. Protecting children from harm and exploitation may come at the expense of the privacy of users’ cloud-hosted data, because preventing and investigating those harms often requires law enforcement access to that data.
  • Uncertainty regarding allocation of responsibility and liability. Cloud providers and their enterprise clients may disagree about which party is responsible for building protections and accessibility measures for vulnerable groups into their offerings.
  • Cultural differences complicate implementation. Cultural differences in attitudes toward vulnerable groups’ digital rights may complicate implementation of these practices in certain communities.

Stakeholder Perspectives

  • Governments: Generally seek to use cloud services to fulfill and protect children’s rights,1 including:
    • providing education, government-issued identity, privacy protections, and access to information and media via the cloud; and
    • protecting children from harm, including abuse, exploitation, and trauma, while they use cloud services.

  • Cloud providers: Want to ensure they are not directly or indirectly (via their enterprise customers) facilitating harm or exploitation of vulnerable groups. (Similar to enterprise customers’ perspective.)
  • Cloud providers: Want to avoid being held liable for indirect harm to vulnerable groups that may arise from their operations or those of their customers. (Similar to enterprise customers’ perspective.)
  • Cloud providers: Seek to win contracts for services that deliver fundamental rights, such as education, identity, and so on, to vulnerable groups.
  • Enterprise customers: Want to ensure that they are not directly or indirectly (via their operations) facilitating harm or exploitation of vulnerable groups.2 (Similar to cloud providers’ perspective.)
  • Enterprise customers: Want to avoid being held liable for indirect harms to vulnerable groups that may arise from their operations or those of their customers or end users. (Similar to cloud providers’ perspective.)
  • International organizations: Seek to build agreement and encourage member entities to protect vulnerable groups in the digital environment.

Tensions With Other Cloud Governance Issues

Potential Ways Ahead

  • Governments: Promote standards and potentially offer incentives to ensure digital services are accessible and inclusive (see the contrast-ratio sketch after this list).4
  • Governments: Prioritize enforcement of existing laws designed to protect vulnerable groups online.
  • Governments: Differentiate and clearly define standards regarding how content harmful to children, senior citizens, and others is moderated, and by whom.

  • Cloud providers: Establish clear policies and robust technical mechanisms that prevent and discourage customers from using cloud services to exploit vulnerable groups, and that penalize those who do (see the hash-matching sketch after this list).
  • Cloud providers: Ensure consumer-facing enterprise customers, such as social media platforms, offer built-in protections for children and other vulnerable groups on their platforms.5 (Shared with enterprise customers.)
  • Cloud providers: Ensure that efforts to combat exploitation compromise user privacy and data encryption as little as possible.
  • Cloud providers: Consider providing additional services at little or no cost to low-income individual users and certain emerging enterprises (for example, minority- or women-owned businesses).
  • Enterprise customers: Establish clear policies that ensure enterprise operations do not (whether directly or indirectly) cause or facilitate harm or exploitation of vulnerable groups.
  • Enterprise customers: Build technical protections for vulnerable groups into platforms (such as limits on recommendation algorithms and accessible text rendering for persons with disabilities).6 (Shared with cloud providers.)
  • Consumer customers: Ensure that children are engaging in safe digital environments by monitoring and appropriately curating content, using privacy controls and other restrictions, researching platforms, and so on.
  • International organizations: Spearhead or expand current initiatives that ensure a safe and beneficial digital environment for vulnerable groups.7
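
To make the accessibility standards above concrete, here is a minimal sketch (in Python) of one measurable WCAG 2.1 requirement: the contrast ratio between text and background colors, which must be at least 4.5:1 for normal-size text at conformance level AA. The formulas follow the WCAG 2.1 definitions; the function names and example colors are illustrative, not drawn from any cited guideline.

    def relative_luminance(rgb):
        """Relative luminance of an sRGB color, per the WCAG 2.1 definition."""
        def linearize(c8):
            c = c8 / 255
            # Undo the sRGB gamma encoding (WCAG 2.1 formula).
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (linearize(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(color_a, color_b):
        """WCAG contrast ratio between two colors, ranging from 1:1 to 21:1."""
        lum_a, lum_b = relative_luminance(color_a), relative_luminance(color_b)
        lighter, darker = max(lum_a, lum_b), min(lum_a, lum_b)
        return (lighter + 0.05) / (darker + 0.05)

    def passes_aa_normal_text(foreground, background):
        """Level AA requires a ratio of at least 4.5:1 for normal-size text."""
        return contrast_ratio(foreground, background) >= 4.5

    # Mid-gray text on white narrowly fails AA; dark gray passes comfortably.
    print(passes_aa_normal_text((119, 119, 119), (255, 255, 255)))  # False (about 4.48:1)
    print(passes_aa_normal_text((51, 51, 51), (255, 255, 255)))     # True (about 12.6:1)

Automated checks like this cover only a slice of WCAG (many criteria, such as keyboard navigability, require human review), which is why the guidelines are typically paired with audits rather than enforced purely in code.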
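
Similarly, the “robust technical mechanisms” bullet can be illustrated with a sketch of one common approach: screening uploads against hashes of known abusive material before accepting them into storage. Deployed systems generally use perceptual hashes (for example, PhotoDNA) so that re-encoded or altered copies still match; the exact-match Python version below, including the placeholder KNOWN_BAD_DIGESTS list, is hypothetical and only meant to show the shape of the check.

    import hashlib

    # Hypothetical blocklist of SHA-256 digests of files already confirmed as
    # abusive material (real lists come from trusted clearinghouses).
    KNOWN_BAD_DIGESTS = {
        "0" * 64,  # placeholder digest, not a real entry
    }

    def digest_file(path, chunk_size=1 << 20):
        """Stream a file through SHA-256 so large uploads never sit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                h.update(chunk)
        return h.hexdigest()

    def screen_upload(path):
        """Return True if the upload may proceed, False if it matches the blocklist."""
        return digest_file(path) not in KNOWN_BAD_DIGESTS

Exact hashes are trivially evaded by re-encoding a file, which is why production systems favor perceptual hashing; and because any such scanning requires the provider to inspect customer-stored data, it feeds directly into the privacy trade-off flagged under Key Considerations.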

Recent Examples

Notes

1 “Convention on the Rights of the Child,” United Nations, November 20, 1989, https://www.ohchr.org/en/professionalinterest/pages/crc.aspx.

2 International treaties and domestic laws in many countries prohibit exploitation and exploitative material, such as child sexual abuse material (CSAM). Many major enterprise customers of cloud services, particularly social media platforms, peer-to-peer messaging services, and cloud storage services, have publicly stated their commitment to ensuring their services are not used to share and store CSAM.

3 Adi Robertson, “Apple’s controversial new child protection features explained,” The Verge, August 10, 2021, https://www.theverge.com/2021/8/10/22613225/apple-csam-scanning-messages-child-safety-features-privacy-controversy-explained.

4 “Web Content Accessibility Guidelines (WCAG) Overview,” Web Accessibility Initiative, July 2005 (updated April 29, 2021), https://www.w3.org/WAI/standards-guidelines/wcag/.

5 Sarah Perez, “TikTok to add more privacy protections for teenaged users, limit push notifications,” TechCrunch, August 12, 2021, https://techcrunch.com/2021/08/12/tiktok-to-add-more-privacy-protections-for-teenaged-users-limit-push-notifications/.

6 Sarah Perez, “TikTok to add more privacy protections for teenaged users, limit push notifications,” TechCrunch, August 12, 2021, https://techcrunch.com/2021/08/12/tiktok-to-add-more-privacy-protections-for-teenaged-users-limit-push-notifications/.

7 “What we do,” UNICEF, n.d., https://www.unicef.org/what-we-do.