the digest, July 2021

Two significant developments on emerging tech

In July, the UN Human Rights Council adopted a Resolution on new and emerging digital technologies.

Unlike the first iteration of the Resolution, adopted in 2019, this second iteration was not adopted unanimously. After the rejection of efforts from China to replace language referring to a “human rights-based approach” and “human rights-based laws and policies”, China, Venezuela and Eritrea all abstained on the vote. Whether this marks a one-off, or is part of a longer-term trend away from the unanimous adoption of technology-focused Resolutions at the Human Rights Council, remains to be seen.

Alongside other civil society groups, GPD engaged closely in the drafting process for this Resolution, and we’re pleased to see the final text explicitly recognise the importance of “a human rights-based approach to new and emerging digital technologies” and of “ensuring appropriate safeguards and human oversight in the application of new and emerging digital technologies”. The text also includes welcome language around stakeholder inclusion, calling for “meaningful participation of all relevant stakeholders, including the private sector, academia and civil society”. In terms of next steps, the Resolution gives the Office of the United Nations High Commissioner for Human Rights a mandate to undertake expert consultations on the relationship between human rights and technical standards and the application of the UN Guiding Principles on Business and Human Rights to the activities of technology companies. We'll be following this work closely.

Elsewhere, July saw UNESCO publish the final version of its Recommendation on the Ethics of Artificial Intelligence. GPD has also been actively engaged in this process, having responded to an open consultation on an initial draft of the Recommendation in 2020 and commented on an updated draft earlier this year. The final text contains a number of positive changes, including the mainstreaming of human rights and additional detail on specific human rights concerns and high-risk AI systems, though some of our concerns about the text persist. We provide fuller thoughts here.

For up-to-date information about opportunities for civil society engagement around AI, take a look at our AI Forums Guide and Events Calendar.

New online content laws in South East Asia: a concerning trend?

The last few months have seen a concerning wave of efforts by governments in South East Asia to restrict online content in the name of tackling illegal and harmful material:
  • In Vietnam, a new draft decree amending Decree No. 72 (which regulates online platforms) was published this month and is currently open for public consultation. It would expand government control over cross-border services, place strict limitations on live-streaming and expand the scope and types of services through which the government may limit legitimate expression—posing significant risks to freedom of expression, and intensifying an already repressive context.
  • In Malaysia, the government enacted the Emergency (Essential Powers) (No. 2) Ordinance 2021. This has been heavily criticised by civil society groups for its imposition of harsh prison sentences for the dissemination of broadly defined “fake news”, wording that could be used to restrict a wide range of legitimate speech and silence criticism. In its requirement for companies and individuals to provide access to authorities conducting searches—including decryption codes—it also raises concerns around individuals' right to privacy.
  • Indonesia’s new MR5 Regulation—currently being implemented—makes it compulsory for all online platforms to register with the government and respond to takedown requests within 24 hours. However, there have been delays—all service providers were initially expected to register with the government by 24 May, but this deadline has been extended, providing an opening for a global coalition of civil society organisations to mobilise.
While different in form, these regulations share common traits. They are all overly broad in scope, giving authorities unfettered discretion to restrict expression in the pursuit of illegitimate aims with few meaningful safeguards; and they impose disproportionate sanctions on individuals and platforms. Resisting them—and the trend they represent—requires concerted action from both civil society and the private sector. 

After all, the online platforms being asked to comply with these repressive laws also have an obligation to respect human rights under international human rights law. While this may be a challenge, honouring this obligation means, at the very least, interpreting and complying with these laws in the most limited way possible. Platforms must also engage with civil society organisations and existing networks—both within the region and globally—to raise awareness of these troubling developments and support advocacy efforts.

Cyber norms at the UN: latest updates

A few updates from the key forums discussing norms for state behaviour in cyberspace:

  • At the Open Ended Working Group, discussions around modalities are still ongoing. Civil society access and participation remain controversial, despite most member states supporting a more inclusive and open approach (see Reaching Critical Will’s summary of the June meeting, pp. 3-4). We expect to hear more in the coming month, as member states will want to have agreed modalities before the busy UN General Assembly season kicks off in September.
  • It seems increasingly unlikely that there will be a UN resolution this year to set up the Cyber Programme of Action, spearheaded by France and Egypt. This isn’t necessarily bad news: pushing it to next year will give stakeholders—including civil society—more time to engage with co-sponsors to shape the aims, scope and modalities of the resolution.
  • The Chair of the Third Committee’s Ad-hoc Cybercrime Committee has started consulting with member states to elicit their high-level views on the scope, objectives and agenda for its upcoming first meeting, which is provisionally set to take place 17-28 January 2022. The deadline for member state submissions is the end of September. We hope states take the opportunity to include civil society and other stakeholders.

Other news

  • As our Senior Programme Lead Sheetal Kumar observed in a recent article, states are becoming increasingly willing to attribute cyber attacks to other states. July furnished a good example, with the US, EU and NATO jointly condemning China for attacks on US servers. However, as was noted elsewhere, only the EU’s statement made reference to UN-agreed cyber norms—a missed opportunity for supporting the adoption and meaningful use of the responsible state behaviour framework.
  • This month brought fresh revelations of states using NSO Group’s controversial Pegasus software to illegally spy on human rights activists, journalists and political opponents. These actions not only gravely contravene human rights; they also undermine trust and security in cyberspace. Along with more than 100 other civil society groups, we’ve signed onto a joint letter calling for a moratorium on the sale, transfer and use of surveillance technology, as well as the institution of a range of new regulatory measures.
  • The first part of our capacity building series for civil society on UN discussions around responsible state behaviour in cyberspace wrapped up this month. You can watch recordings of the presentations on our dedicated UNGA hub. Applications for the second part will open in August, so stay tuned.

Listening post

Your monthly global update, tracking relevant laws and policies relating to the digital environment.

On the online content side, we saw several important developments:

  • In Armenia, a new bill on digital disinformation poses clear risks to freedom of speech and expression.
  • The Italian Senate is debating Bill 2005, which would amend the Penal Code's provisions on incitement to violence, and is reviewing expert contributions on the bill.
  • A newly passed Law on Protection from False and Inaccurate Information in Kyrgyzstan authorises the government to block certain types of information without a court order, in violation of the right to freedom of expression.
  • The government of Thailand announced Regulation No. 29, which empowers the authorities to censor online expression, and investigate and prosecute individuals responsible for communications that may “instigate fear”.
On trust and security, a few updates on laws relating to cybercrime and national cybersecurity strategies and policies:

On the emerging tech front:
  • India’s Joint Committee of Parliament was granted a fifth extension, during the Monsoon session of Parliament, to submit its report on the Personal Data Protection Bill 2019, amid the government’s implication in the Pegasus spyware scandal.
  • The British Virgin Islands’ Data Protection Act 2021 came into force.
  • Ireland published its National AI Strategy, ‘AI – Here for Good’, which includes strong language around human rights.
