We believe that this Bill is based on false premises about the nature of freedom of expression and the differences between digital and non-digital speech. We see the Bill as being a well-meaning but misguided threat to the civil liberties of New Zealanders. We fear that the Bill will be ineffective in too many cases where it might be needed most, while being too effective in the cases which are most problematic to civil liberties.
We support the establishment of an agency to assist those harmed by harmful communications and believe that this will go a long way to resolving the types of situations that can be resolved.
We believe that the court proceedings are unfair and unlikely to be of much use. We support the discretion and guidelines given to the court in making a judgement, but believe that the procedures of the court need to better take into account the requirements for a fair trial.
The safe harbour provisions for online content hosts are unreasonable. While online content hosts do need protection from liability, the suggested mechanism amounts to a way that any person can get material taken down that they don’t like for any trivial reason. This section needs to be completely rethought in the context of overseas experiences to ensure that freedom of expression is properly protected.
The new offence of causing harm is poorly conceived and criminalises many communications that are of value to society. If not removed in its entirety, defences and an overriding Bill of Rights veto should be added.
We have also made comments on the changes to the Harassment and Crimes Acts.
The Bill in context
While we sympathise with the aim of the bill to reduce digital harms, we have a serious disagreement with many of the ideas underpinning the bill:
- We do not believe that online and electronic communications should be held to a different and higher standard than offline speech.
- We do not believe that harming someone through a digital communication is somehow worse than a comparable level of harm done through another form of communication.
- We do not believe that it is desirable or possible to remove harm from communications whether online or offline.
We adopt the simple premise that the same human rights we have offline should apply online. But this Bill creates much greater limitations on rights online than are permissible offline. For this reason we strongly object to the Bill in principle. However, accepting the likelihood that this Bill will become law, we make detailed comments and recommendations in our submission below.
Freedom of expression and harm
Freedom of expression does not guarantee that speech will never cause harm. Indeed, one of the reasons freedom of expression is so important is because of the power of speech to reveal what was hidden, to influence people and to change what people think.
It is important to emphasise that expression, however “harmful”, is an act of speech – not a physical act causing physical harm. In general, our law penalises acts which cause physical harm, such as assault, manslaughter and murder, more harshly than acts of speech. This bill radically alters this approach, seeking to respond to new forms of emotional harm, namely those carried out using digital communications.
This Bill seems to take the view that harm, defined as serious emotional distress, is to be avoided wherever possible, and that speech which causes such harm should be limited and controlled, with the speaker restrained and punished. Oddly, the Bill only applies this principle of “no serious emotional harm” to speech communicated electronically. For example, harmful speech must be limited and stopped if it is communicated by text message, radio waves, television or the internet. Harmful speech communicated by voice, newspaper, billboard or letter will have a completely separate set of rules, with no agencies to help mediate and no quick-fire court action.
The absurdity of this distinction becomes plain when you consider the content of a “poison pen” note written on paper and slipped under the door. This Bill would not take any notice of the possible harm caused by this non-digital communication – unless someone then took a photo of it and emailed it.
We acknowledge that some argue that electronic speech is different because it can spread faster or can be more easily distributed. We do not deny that people use the internet and other digital services to be horrible to each other. But at the same time, we have lived with rumour, gossip, anonymous letters, scurrilous posters and the consequent harms for many years. This sort of cruelty, and the suffering it can lead to such as social ostracism or suicide, sadly seems to be inherent to being human. At the same time, the internet has opened up new ways to counter such harm, to empower victims, and for society to condemn the perpetrators of such cruelty.
And it is exactly because electronic speech, with all its speed and ease of distribution, is so powerful that we must be particularly careful to preserve our ability to use it without unreasonable restraint. Political speech on the internet is not a second-class citizen to that in the newspaper, and the internet is increasingly the most important place where we exercise our freedom of expression. This Bill, by treating online and offline speech differently, sets a dangerous precedent.
Failure to consider global best practice and internet governance
The internet was developed and has evolved without specific laws. Yet the very conditions which created this incredible technology are now under threat by Bills such as this one. All over the world, many dozens of countries are legislating in an attempt to regulate online content. This Bill is no “world leader” in that regard: quite the contrary. Already countries as diverse as South Africa, the United Kingdom, India, Pakistan, Malaysia, the Philippines, Mexico, Colombia, the United States, Canada, Bosnia-Herzegovina, Russia, and many others have taken steps to regulate and attempt to control the expression of their citizens online.
Among these, New Zealand ranks highly for internet access and we pride ourselves on having a strong human rights ethos. Yet our online freedoms and tech liberties have a soft underbelly. For example, a recent study on internet freedom in New Zealand concluded that New Zealand meets barely half of the 29 indicators for minimum protection of human rights online under the La Rue Framework. The study concluded that New Zealand complies with 14 indicators, does not comply with 4, and that compliance is unclear for the remaining 11. These surprising results show that New Zealand’s high rating for human rights and freedom of expression offline is not reflected online. In our view the Bill, though crafted with admirable good intention, fails to properly balance and impose reasonable limitations on freedom of expression and, in doing so, will further reduce protection of rights online.
The Bill has been framed without any reference to the founding documents of the United Nations World Summit on the Information Society, the Geneva Declaration of Principles, and the Internet Governance Forum, all of which are relevant to human rights, the internet and the regulation of online content. The failure to frame the Bill by reference to these wider international multi-stakeholder agreements and processes, which successive New Zealand governments have accepted and participated in, reveals the siloed nature of thinking about the Bill under New Zealand law.
We therefore recommend that the objectives of the Bill be amended to specifically “recognise the global and open nature of the Internet”.
Necessary and Proportionate – Remedies for Online Rights Violations
A new set of global principles for the protection of privacy online has recently been developed to assist lawmakers. While these have been drafted in relation to online surveillance, we think these are relevant and should also be taken into account by the Select Committee in considering this Bill.
In particular, the Necessary and Proportionate Principles explain how international human rights law applies in the current digital environment, particularly in light of the increase in and changes to communications surveillance technologies and techniques. These principles can provide civil society groups, industry, States and others with a framework to evaluate whether current or proposed surveillance laws and practices are consistent with human rights. The principles were the outcome of a global consultation with civil society groups, industry and international experts in communications surveillance law, policy and technology.
Violence Against Women
A significant rationale for the Bill (cited in Cabinet Papers and the Law Commission papers) is the need to address cyberbullying and violence against women. While we share concerns about these forms of harm, we have serious concerns that this Bill has been drafted without meaningful input from women’s organisations – indeed it appears that only one organisation, the National Council of Women, made a submission on the Law Commission’s original proposals. We question therefore the usefulness of this Bill for the groups it purports to protect: can this Bill really be effective for women, particularly those who are victims of violence online, if they have not been involved in the process of developing it?
Violence against women online is a serious issue. As Joanne Sadler has powerfully said:
Many individuals and organisations working at the intersection of feminism, social justice, human rights, and information technology echo the observations in a recent report: “If we agree that the online world is socially constructed, then gender norms, stereotypes and inequality that exist offline… can be replicated online.” … But as we laud the power of the internet to inform, break through isolation, and unite, we must also recognise its power to misinform, divide and put individual and collective rights in direct conflict with one another.
Many governments are regulating in this field, but international studies reveal that many women’s organisations are extremely concerned at what governments are doing in the name of protecting women from harm. In many cases, regulation of online content is in fact being used to restrict women’s rights to freedom of expression and freedom of association, rather than to enhance their safety. For example, women’s human rights advocates report that their online content related to women’s rights – such as material about abortion, sexual and reproductive health and rights, and protection from domestic violence – is being complained about for political reasons or on the grounds that providing such information is harmful to others, such as young people.
However, information and communications technologies like the internet and mobile phones are a double-edged sword – they can be used by abusers to deepen their control, by survivors of violence to connect to help, and by women’s rights defenders to inform, denounce and strategise to end violence. For these reasons, many women’s rights advocates seek less, rather than more, regulation.
The Bill fails to adequately take into account these complexities. For example, the provisions for complaint and takedown will be open to mischievous and vexatious complaints about so-called “harmful” online content, particularly as a way to stifle or take down content which is politically contentious but legitimate expression. Information about abortion and sexual and reproductive health is a good example.
It is critical that the Bill does not become a vehicle for limiting legitimate expression in these ways. Nor should gender-based hate speech directed at women, such as revenge pornography (currently banned by the Crimes Act), be left unrestrained in a way that effectively silences women’s own free speech rights. A balance needs to be struck, but we do not think this balance has been achieved, not least because neither the Law Commission nor the Ministry of Justice has adequately consulted with women’s rights organisations or specialists in women’s rights online.
Finally, the principles fail to take account of the life cycle of online activity, which can quickly harness collective action to combat an abuser or harasser in ways that empower victims. For example, we note that many of the statements made against the RoastBuster gang would be defined as harmful communications by this Bill. We therefore recommend that the principles be amended to better empower victims to use remedies without penalising them for doing so.
Jurisdiction and effectiveness
The Law Commission report correctly notes that jurisdictional difficulties will make it impossible to resolve some significant percentage of harmful digital communications online. Overseas websites and communications services will be under no obligation to respond to orders from the Approved Agency or District Court.
We can fully expect that people who maliciously abuse others for “fun” (aka trolls) will quickly learn to use overseas services that provide sufficient anonymity, limiting the ability of any New Zealand law to limit that particular type of harmful communication.
However, there are also other serious problems with the effectiveness of the Bill’s measures.
Speed and silence
The first major problem is that in many cases the harm will already have been done before the District Court gets involved. In any scenario, the situation first has to get bad enough that the complainant looks for a solution. Then they have to make an initial complaint to the Agency, which will try to assist them to resolve it. Finally, if that does not help, it will be escalated to the District Court. The chances are that in a significant number of cases the harm will already have been done or the situation will have resolved itself.
As said by the victim in one of the more horrible stories in the Law Commission’s report, “in less than a day after it happened, I started getting more abuse at school”. At that point the information was out, people were already acting cruelly in person and the problem was far beyond the scope of any order the District Court could make no matter how fast they moved.
Secondly, a major recurring element in the stories describing the harms done by this type of bullying is that the victim suffers in silence without telling anyone. In these cases it is obvious that the Agency and Court cannot help – we need education and a change in society’s attitudes.
The complaints regime – Principles, Agency and District Court
The 10 Principles
A major part of the Bill concerns the creation of principles and rules about what types of language are acceptable in electronic communications. These principles and rules go significantly beyond current New Zealand law.
The Bill sets up 10 communication principles to be used by the Agency and Court, although the Bill states them as universal truths of the form “A communication should not…”. While some of these are derived from existing law (e.g. harassment, threats), others are new, such as the principle that a communication should not be grossly offensive or indecent.
There are potential problems with some of them. For example, the statement in principle 6 that a communication should not make a false allegation should surely read “knowingly false allegation”; and when principle 7 says that a communication should not contain a matter that is published in breach of confidence, what does that mean for whistleblowing and leaks?
Principle 10 expands the realms of unacceptable denigration: “A communication should not denigrate a person by reason of his or her colour, race, ethnic or national origins, religion, ethical belief, gender, sexual orientation, or disability.” The Human Rights Act currently forbids the denigration of people on the grounds of colour, race, or ethnic or national origins. Principle 10 obviously expands this significantly but, interestingly, it omits some of the other obvious categories from the Human Rights Act such as age, marital status, employment status and family status. A particular problem is the ground of “ethical belief” – this category is vague and largely untested in New Zealand law.
The Law Commission’s report gives no reasoning for this expansion of unacceptable denigration, nor does it explain why it chose some grounds but not others. It does say that the Human Rights Act should catch up but this seems to be putting the cart before the horse. The reality therefore, is that rather than encapsulating existing law, this principle would restrict a far greater range of expression online.
More than 40 governments (including the US, UK, and Australia) have recently affirmed:
“… that the same rights that people have offline must also be protected online, in particular freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice, in accordance with articles 19 of the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights”
The question has to be where did these principles come from? The Law Commission’s report says they’re a bit of a grab bag from existing criminal and civil law.
More importantly, are they a set of principles that we can all agree on? Shouldn’t there be some more public discussion around them? If these principles are so obviously correct why should they only apply to electronic communications?
We fear that these principles will be picked up and used in future legislation, legislation that might allow less discretion in how they are applied than this Bill does. We worry that they are poorly thought out and not properly grounded in New Zealand law and tradition. We are concerned that many communications that are of value to our society would be unreasonably caught by these principles. We are especially concerned that, as framed, these principles restrict our freedoms online far more than our freedoms offline.
Accordingly, we believe that these principles need to be seriously rethought and rewritten to ensure that they are fit for purpose, in line with New Zealand law and values, and sufficiently protective of our right to freedom of expression.
We also suggest that there might be value in including some positive communication principles. These could cover communications that:
- Are satire, parody or humour.
- Express an honestly held opinion.
- Contribute to a discussion.
Including these and others would give the Agency and the Court additional tools to help them in their decision making.
The Approved Agency
The creation of the Agency with the ability to advise, guide and assist people in dealing with digital harms is the one part of the Bill that we strongly agree with. We see that this would be of significant benefit and would go a long way to resolving most of the problems with harmful digital communications that can be resolved.
We hope that the Agency will be funded sufficiently to take on what will probably turn out to be a rather arduous task.
We recommend that the Agency should be charged with collecting statistics about the types of problems that it encounters and how they are resolved (or not). This will be helpful in future policy development and law making.
The District Court
It is our belief that the Agency will provide the most benefit in those cases that can be resolved and we have some doubts about whether the District Court procedures will actually be that effective.
Our preference is for this part of the system either to be removed from the Bill, or for its introduction to be postponed until there has been time to see how well the Agency works.
We approve of:
- The recommended use of technical advisers to assist in certain cases.
- The discretion given to the court, the guidelines in 17(4) and the explicit mention of the Bill of Rights Act in 17(5).
However we have concerns around:
- The lack of due process with the Court including the lack of procedures, the acceptance of otherwise inadmissible evidence and the ability to make orders without the defendant being able to defend themselves.
- The ability to order an insincere and meaningless apology.
Recommendations:
- Rewrite the principles to better reflect current New Zealand law and social standards.
- Add positive communication principles as an additional guide to the Agency and Court.
- The Agency should be required to collect and publish statistics about its work.
- Consider delaying the introduction of the District Court proceedings until such time as the Approved Agency has been running for a year or two and we have a better idea of what is required.
- Giving more guidance to the structure of the District Court proceedings with a view to better protecting the rights of both the complainant and defendant.
- Include a required review of this law two years after it comes into effect.
- Remove the ability to order an insincere and meaningless apology.
New offence of causing harm – section 19
We are not reflexively opposed to the idea of an offence for deliberately causing harm via digital communications (although once again we are not certain why it is only limited to the digital world). While freedom of expression is important and should not be unduly limited, it seems possible to us that there are cases when deliberately causing serious emotional distress should be illegal.
However, this new offence is actually very widely defined and may well capture many communications that are of immense value to society.
In addition, it contravenes international law insofar as it effectively criminalises defamation. The UN Special Rapporteur has specifically said that criminal defamation violates freedom of expression and is an unnecessary and disproportionate measure under international human rights standards. In many countries similar provisions are used politically to suppress speech, particularly during election periods.
Let’s consider the case where someone takes a photo of a politician receiving a bribe and, shocked at their corruption, posts that photo to the internet. This communication would easily match the three requirements of the proposed law:
- Be posted with the intention of harming the victim (the prospect of facing criminal charges or being obliged to resign could be assumed to cause the victim distress).
- Cause harm to any reasonable person in the position of the victim (no reasonable person would like having evidence of their criminal corruption exposed to the world).
- Be easily proved to have caused harm (serious emotional distress) to the victim.
The person would then face a penalty of up to three months in jail or a fine of up to $2,000.
As shown in our example above (and it’s trivial to think of many other examples) this offence will catch many communications which are of value to society. There are no available defences such as that the communication may be in the public interest, counts as fair comment, or exposes criminal wrongdoing. Truth is explicitly removed as a defence in section 19(4)(a). There is no mention of freedom of expression and the NZ Bill of Rights Act.
Indeed, the barrier here seems far lower than that specified for the lighter touch of the complaints regime even though the penalties are more serious.
We are greatly concerned that this law will have a seriously detrimental effect on freedom of expression and public discourse in New Zealand. How will our journalists and citizen journalists be able to expose wrongdoing when broadcasting it on electronic media such as the internet, radio or TV is a criminal act if it hurts the wrongdoer’s feelings?
This law as written would never be acceptable if it applied to speech in a newspaper, and we believe it is just as unacceptable when applied online.
Intimate Visual Recording
We fail to see the benefit of including the definition of an intimate visual recording (largely copied from 216G of the Crimes Act) and explicitly defining intimate visual recordings as coming under this section. Communicating an intimate visual recording in an attempt to harm someone would clearly already be covered by this section without this addition.
Recommendations:
- Remove this section entirely until such time as it can be rewritten in a technology-neutral way to capture all unacceptably harmful communications, not just those made digitally.
- Improve the balance between freedom of expression and stopping harm. Provide defences such as those for defamation and acknowledge that some harmful communications have significant societal value.
- Provide an explicit Bill of Rights check.
- Remove the definition of ‘intimate visual recording’ from 19(4) and remove explicit mention of this from the definition of ‘posts a digital communication’.
Internet content hosts safe harbour – section 20
While we agree that safe harbour provisions can be a useful part of internet law, the proposed safe harbour in section 20 has a number of significant flaws and fails to strike the right balance between protection and civil liberties.
Don’t like what someone has said about you online? Send in a complaint and wait for it to be taken down. Or get hundreds of your friends to all send in complaints and swamp the online content host.
This applies to comments on blogs, forums on auction sites, user-supplied content on news media sites, etc, etc. These are exactly the places where a lot of important speech occurs including discussions about politics and the issues of the day. The debates can often be heated, and some sites are well known for encouraging intemperate speech, but these discussions are becoming an increasingly important part of our national discourse.
This law will make it too easy for someone to stop arguing and start making complaints, thereby suppressing the freedom of expression of those they disagree with.
Need for protection
We do need protection for online content hosts. The current system where they can be held liable for information posted by others is in itself a serious limit on freedom of expression – it’s hard to allow a forum for free-wheeling debate if you fear being sued over it. NZ’s current defamation law is seen as a particular problem and we suggest that it might be worth attacking the problem at the source by reviewing the Defamation Act.
However, trading off protection for online content hosts in exchange for a speech-suppressing trigger-happy takedown system is not the right answer. While this serves the needs of the big content providers, it fails to serve the wider needs of freedom of expression in New Zealand.
Anyone can complain
Anyone can complain to an online content host (someone who has control over a website) that some material submitted by an external user on their site is unlawful, harmful or otherwise objectionable. The online content host must then make a choice:
- Remove the content and thereby qualify for immunity from civil or criminal action.
- Leave the content up and be exposed to civil or criminal liability.
Accordingly, the content host has to decide whether to:
- Expend resources on making its own determination about whether a piece of given content is unlawful (which may be very difficult when it comes to subjective issues such as defamation and impossible to determine when it concerns legal suppression), harmful or “otherwise objectionable”, and put itself in the firing line should it make the wrong call; or,
- Expend almost no resources and take the content down.
We can safely assume that most content hosts will tend to play it safe: large corporates with risk-averse legal teams are not noted for their defence of freedom of expression, and small personal or group blogs have neither the legal knowledge to make a sound judgement nor the money to defend themselves if they get it wrong. Both have nothing to gain and plenty to lose by leaving complained-about material online.
Furthermore, there is:
- No oversight of the process by any judicial or other agency.
- No requirement for the content host to tell the person who originally posted the content that it has been deleted.
- No provision for any appeal by the content host or the person who originally posted the material.
- No onus on the complainant to act in good faith or to tell the truth.
- No requirement that the complainant has actually been affected by the speech they are complaining about.
- No penalty for making false or unreasonable complaints.
- No barrier to making multiple complaints or to getting large numbers of people to all make complaints. (We confidently expect that someone or some group will set up automated systems to flood targets with hundreds or thousands of complaints a day.)
As currently written, these safe harbour provisions are a bad idea. It seems that they’ve been rushed into this bill without sufficient thought or consultation. They don’t take into account the experience of safe harbour laws in other jurisdictions. They’re too open to abuse and we believe they’re more likely to be used to suppress acceptable speech than to eliminate harmful or “otherwise objectionable” speech.
The issues around safe harbour provisions are complex and still evolving. We believe that this section should be removed until such time as it can be rethought and significantly redrafted after consultation and further research. These recommendations should be read as input to such a process:
- Complaints should only be accepted from the targets or people directly affected.
- The system must be designed so that online content hosts don’t have to make difficult ethical and legal decisions with the chance of serious punishment if they get it wrong.
- The law should make clear that an online content host is not liable for any content when they, in good faith, decline a complaint.
- Due process demands that, where possible, the original poster of the material must be notified and have an opportunity to challenge the takedown notice.
- Removed content should not silently vanish but should be replaced by a placeholder explaining that it has been removed. This will allow participants in a discussion to realise that censorship is taking place.
- Abuse of the process needs to be discouraged. One suggestion has been that people making a complaint would need to swear an oath that they are telling the truth at risk of some penalty if this is then shown to be untrue.
- Content hosts should have the right to discard automated or repeated takedown requests without losing the safe harbour protection.
- The importance of the right to freedom of expression in the NZ Bill of Rights should be reiterated.
- Solve the source of part of the problem by reviewing the Defamation Act.
Finally, it is important that any system or process put in place must be accessible and easy for everyone involved: the complainants, the original posters, and the online content hosts that must handle the process. It would be terrible if the final outcome was that more and more content hosts just turned off user-submitted content as they couldn’t handle the complaints process.
Updates to Other Acts
Crimes Act – sections 23 & 24
The change to make inciting, counselling or procuring suicide a crime even when the target does not attempt suicide appears to be an attempt to control a particular type of bullying.
However, convicting someone of a crime that can be punished by up to 3 years in prison for something as trivial as responding “go kill yourself” in a heated argument does not seem reasonable. We suggest that this type of bullying would be better handled under the new complaints regime detailed elsewhere in this bill.
Recommendations:
- Remove sections 23 and 24, or alternatively, limit the new offence to communications made to those under 18 years of age.
Harassment Act – section 26
We support the clarification of current law to show that harassment also includes acts performed via electronic communications.
The other change, which says that a single act can count as harassment if it is sustained over time, is more problematic, particularly when applied to digital communications. We acknowledge that a single communication can have both reach and staying power when available on the internet. However, it seems unreasonable to define something as harassment where the original communicator may have no control over how it is kept and shared on the internet.
For example, if someone posts a video to the internet and then, after watching it spread further than they expected, has second thoughts and takes it down a few hours later, the chances are that additional copies have already been made that can be shared again and again by others. Counting this as deliberate harassment would seem unjust.
Recommendations:
- Limit the scope of section 26 so that it only applies a) where the accused harasser has posted material in such a way as to deliberately harass someone, and b) where the accused harasser could remove the material or has deliberately limited their ability to remove it.