This oral submission concentrated on two misconceptions that we see as underpinning the bill: that speech should never harm anyone, and that different rules should apply to speech online and offline.
We then discussed problems with the effectiveness of the bill - and how it might not be that useful for victims of digital harms but might be quite handy for people who want to suppress the views of others.
We believe that this Bill is based on false premises about the nature of freedom of expression and the differences between digital and non-digital speech. We see the Bill as being a well-meaning but misguided threat to the civil liberties of New Zealanders. We fear that the Bill will be ineffective in too many cases where it might be needed most, while being too effective in the cases which are most problematic to civil liberties.
We support the establishment of an agency to assist those harmed by harmful communications and believe that this will go a long way to resolving the types of situations that can be resolved.
We believe that the court proceedings are unfair and unlikely to be of much use. We support the discretion and guidelines given to the court in making a judgement, but believe that the procedures of the court need to better take into account the requirements for a fair trial.
The safe harbour provisions for online content hosts are unreasonable. While online content hosts do need protection from liability, the suggested mechanism amounts to a way that any person can get material taken down that they don’t like for any trivial reason. This section needs to be completely rethought in the context of overseas experiences to ensure that freedom of expression is properly protected.
The new offence of causing harm is poorly conceived and criminalises many communications that are of value to society. If not removed in its entirety, defences and an overriding Bill of Rights veto should be added.
We have also made comments on the changes to the Harassment and Crimes Acts.
The safe harbour provisions in the Harmful Digital Communications Bill are a serious threat to online freedom of speech in New Zealand.
How it works
Anyone can complain to an online content host (someone who has control over a website) that some material submitted by an external user on their site is unlawful, harmful or otherwise objectionable. The online content host must then make a choice:
- Remove the content and thereby qualify for immunity from civil or criminal action.
- Leave the content up and be exposed to civil or criminal liability.
The content host has to make its own determination about whether a given piece of content is unlawful (which may be very difficult for subjective issues such as defamation, and impossible where court-ordered suppression is involved), harmful or "otherwise objectionable".
Furthermore, there is:
- No oversight of the process from any judicial or other agency.
- No requirement for the content host to tell the person who originally posted the content that it has been deleted.
- No provision for any appeal by the content host or the person who originally posted the material.
- No penalty for people making false or unreasonable claims.
We can safely assume that most content hosts will tend to play it safe, especially if they're large corporates with risk-averse legal teams, and will take down material when requested. They have nothing to gain and plenty to lose by leaving complained-about material online.
Serious ramifications for freedom of speech
Don't like what someone has said about you online? Send in a complaint and wait for it to be taken down.
This applies to comments on blogs, forums on auction sites, user-supplied content on news media sites, and so on. These are exactly the places where a lot of important speech occurs, including discussions about politics and the issues of the day. The debates can often be heated, and some sites are well known for encouraging intemperate speech, but these discussions are becoming an increasingly important part of our national discourse.
This law will make it too easy for someone to stop arguing and start making complaints, thereby suppressing the freedom of expression of those they disagree with.
The jurisdiction problem
Of course, this will only apply to websites that are controlled by people who have a legal presence in New Zealand. Overseas websites will continue to maintain their own rules and ignore New Zealand law and standards of online behaviour.
As currently written, these safe harbour provisions are just a bad idea. They're too open to abuse and we believe they're more likely to be used to suppress acceptable speech than to eliminate harmful or "otherwise objectionable" speech. As a very minimum, the complaint should have to be approved by the Approved Agency referred to in the other parts of the Bill.
That said, the whole idea of removing "otherwise objectionable" speech is also quite worrying. The Harmful Digital Communications Bill already has an expansive set of rules about what sort of harmful speech shouldn't be allowed online and this "otherwise objectionable" seems to extend it even further. One of the principles we stand up for here is that civil liberties such as freedom of expression are as important online as they are offline, and this law goes far beyond anything in the offline world.
We hope to have more comment and analysis on other aspects of the Harmful Digital Communications Bill soon.
Please send any updates or other useful links and we'll incorporate them. Last updated: 10/9/2012.
- What's wrong with the Communications (New Media) Bill and can it be fixed?
- Law Commission - Harmful Digital Communications
- Powers of the Proposed Communications Tribunal
- Lawyer Steven Price
- Lawyer John Edwards
- Stephen Bell at Computerworld
- Mike O'Donnell from Trademe at Stuff
- David Farrar at Kiwiblog
- Chris Barton at NZ Herald
- Richard Boock at Stuff
- Negotiation is the new black - the "Approved Agency"
- Trolls provide motivation for greater regulation
- Workshops on the Communications Bill are worth attending
Police Minister Judith Collins
- The creation of a new criminal offence that targets digital communications which are "grossly offensive or of an indecent, obscene or menacing character and which cause harm". Harm is said to include physical fear, humiliation, mental and emotional distress.
- The establishment of an Agency (i.e. Netsafe) that will be able to assist and advise people suffering from unpleasant digital communications.
- The establishment of a Communications Tribunal that will be able to respond to complaints and provide "speedy, efficient and cheap access to remedies such as takedown orders and cease & desist notices."
- Amendments to the Harassment Act, Human Rights Act, Privacy Act and Crimes Act to ensure that the provisions of these laws can be applied to digital communications.
- New requirements for NZ schools to work harder at stopping bullying of all kinds.
While sympathetic to the aims, we have some serious questions about the law and the thinking that lies behind it. This article discusses some of the problems that we see, talks about ways to resolve them and asks whether the problems are too great for some parts to be worth pursuing. We have arranged our arguments thematically and finish with our conclusions and recommendations.
Tech Liberty made a submission to the Media Regulation review run by the Law Commission. The summary of our submission is as follows:
We recognise that "big media" still has a lot of influence in New Zealand but that this influence is declining as the internet gives people the ability to:
- self-publish ("little media")
- share and distribute self-published articles
- publicly critique the work of big media.
This change can be seen in the way that online media such as blogs used to be very reactive to work published in newspapers and TV, but now newspapers and TV are increasingly picking up stories from blogs and other forms of social media.
Much of the rest of the review was about how the media should be regulated, but we believe that the need for greater media regulation has not been established.
Defining news media
The review suggests that regulation could be a trade-off for official recognition of news media, and spends a lot of time discussing who would be included in the definition of "news media". We believe any definition would either be so broad as to be useless or so narrow that it would miss out many people and publications that arguably should be covered. This is especially true as journalism continues to develop and change in the internet age.
Special privileges for news media
The review suggests that we need a definition because some laws refer to the news media to bestow special privileges. Our preference is that these privileges should be extended to all citizens (e.g. replace the media "fair dealing" section in the Copyright Act with a more general "fair dealing/fair use" provision for all people) or should be available to all people when they are acting as a journalist.
Furthermore, any organisation that wishes to include or exclude "news media" can make its own determination as to who that is, rather than relying on a government-mandated definition.
We do not believe that there is a need for an external regulator. Indeed, as the internet gives people the means to publicly criticise the output of big media, the need for a regulator is reduced compared to the days when only a very limited number of media companies could get their views out (due to limited airwaves or the need to own a printing press).
Current regulation is also generally quite ineffectual. The original message still goes out and then any correction is ignored because the issue is no longer "news". Regulation tends to be after-the-fact score-keeping at best.
Any publishing company or journalist who wishes to be taken seriously has the ability to form a group and create their own code of ethics and regulator. The Press Council is an example of this and we do not see why other media groups who wish to be taken seriously could not do the same.
Finally, if there were to be a regulator, our view is that it should take the form of an Ombudsman with the ability to make morally rather than legally binding decisions.
Malicious speech online
The second part of the review was about harmful speech online.
We agreed that malicious speech online can be a problem, just as it is face to face. Furthermore, the nature of the internet means that malicious speech can both spread further and remain available longer.
We believe that the law is limited in what it can do about people being nasty to each other, either online or in person. Even if current law could deal with these issues, the international nature of the internet and the inevitable jurisdiction issues would mean that only a small proportion of problems could be resolved.
That said, many of the more contentious disputes will be between people who know each other well and probably even live in the same area. The law should be able to deal with issues of harassment using existing laws (possibly with the tweaks identified by the Commission to ensure that online communications are definitely covered).
We reject the idea that speech online should be held to a higher standard than any other form of speech.
We do support the creation of a new crime of "malicious online impersonation", with the caveat that it must be carefully drafted so as not to capture obvious cases of parody and other forms of non-serious impersonation.
No ISP responsibility
We oppose any attempt to make ISPs responsible for taking down or blocking information either hosted on their network or available through it. This is because ISPs typically have no visibility or control over the material that their customers might store on servers hosted with the ISP. An ISP will usually have only blunt instruments available: passing the request on to the publisher, or turning off the entire site. Closing down an entire site would seem a gross over-reaction to the content of one offending post or comment.
It does seem appropriate to us that an ISP might have a responsibility to pass on a takedown message to the site owner (similar to the copyright legislation) or, upon presentation of a suitable court order, reveal the identity of the site owner so that legal action can be taken.
We recently wrote about how an offensive website was taken offline by complaints.
In particular, we talked about the tactics that were used to take them down and whether they were a good thing for the internet or not. The two tactics described were:
- Complaining to the ISP that the site breached their terms of service. We said this risks reducing opinion on the internet to the level of whatever a company's PR department finds acceptable.
- Using copyright complaints over the site's use of a photo without permission. Taking down an entire site over what is arguably a reasonable use of an image is an affront to freedom of speech and shows how dangerous these US-style shoot-first-ask-questions-later copyright laws are.
The article attracted a fair bit of comment both for and against the use of these tactics. We also received some new information and thought it was worth posting a followup.
This is a post about the tactics used to take down a New Zealand website hosted in the USA, and what they mean for the Internet. (Update post.)
Soon after the Christchurch quake, a website (christchurchquake.net) was published that said the quake was God's punishment for Christchurch's tolerance of homosexuality, with God being especially annoyed by Gay Ski Week. The website also made a number of other very odd claims concerning a conspiracy of "Phoenician-descended swamp lesbians" headed by Helen Clark that had taken over New Zealand.
The site is no longer available (Google cache here). This is because a number of people found the site highly offensive, and some of them decided that they would do what they could to get the site taken off the Internet.
The author of the site could not be identified so most action was aimed at getting Bluehost, a company based in the US state of Utah, to take it down. Two main tactics were employed:
If you've read our article about Sky's takedown notices you might be interested to see what a Sky takedown letter looks like, complete with the follow-up conversation.