Submission (oral): Suppression in Criminal Reform Bill

Text of Tech Liberty’s oral submission to the Justice and Electoral Select Committee concerning name suppression in the Criminal Procedure (Reform and Modernisation) Bill. (See our earlier articles.)


I represent Tech Liberty, a group dedicated to defending civil liberties in the digital age.

Our submission concentrates on the issues around suppression. It’s split into two parts, starting with some general comments about suppression in the Internet age, followed by a discussion of some of the difficulties with making internet service providers liable for the actions of their users.

General Points

As a civil liberties group, we wish to start by reiterating our support for open justice whenever possible. We believe that the overuse of suppression weakens our justice system and therefore we support the bill’s measures to reduce the availability of suppression. We’re also pleased to see better guidelines about when suppression is appropriate and how it is to be applied.

However, we’re concerned that we’re just rearranging the deck chairs on the Titanic – that suppression is rapidly going to become so hard to enforce that we’re going to have to give up on it, whether we want to or not. This is because our ability to store and access information is changing so rapidly.

Identifying Information

Section 198 defines the term “identifying information” and includes “any other information that is likely to lead to the identification of that person”.

Suppression has always had the problem of three people each reporting a different fact, with the combination of the three facts leading to successful identification. However, this problem is getting worse as the Internet increases people’s access to information.

To illustrate by example, there was a recent case in Auckland where a 46-year-old celebrity was granted interim name suppression. As an experiment, I set out to find his name. A quick search turned up a page on Wikipedia listing “well known New Zealanders born in 1964”; I looked at the page for each likely suspect and found that there had been recent activity on Martin Devlin’s page. Total time – about 5 minutes, starting from very minimal information.

As civil liberties campaigners we’ve talked about the privacy risks that occur when companies like Google collect and aggregate data from multiple sources and then let anyone search for it.

We’re left with the problem that nearly everything is identifying information when so much data is aggregated, cross-indexed and made available on the Internet.

Publishing information

Related to this, section 199 uses the terms publication and publish.

It is our understanding that suppression has always been aimed at the publishing of information and makes no attempt to stop private communications. This has worked well in the past, but this distinction is becoming less and less relevant.

Have I published information if I post it to Facebook? Does it make a difference if I post it so that only my family and 200 closest Facebook friends can read it?

These published private communications also persist for a lot longer. The chat down at the pub is quickly forgotten but Facebook posts, blog posts, messages on forums and Twitter updates last forever and often can be read by anyone in the world with an internet connection. And, of course, as we just mentioned, they’re searchable by Google and others.

International publishing

Finally, much of this data is published on overseas websites, out of the reach of New Zealand law. For example, I mentioned finding Martin Devlin’s name on Wikipedia.

Conclusion

The combination of these three trends – aggregation and searching, the publishing of private communications, and the offshoring of publishing – must surely make us all wonder whether our current suppression regime can be maintained.

While we do not oppose the idea of suppression and see that it has a place in New Zealand’s justice system, it may be better to start thinking about how we will have to adapt our system to a post-suppression world.

Internet Service Providers

We now turn our attention to section 216 concerning the liability of internet service providers. We strongly disagree with the idea that internet service providers should be held liable for material published by their customers.

ISPs should not be responsible for the actions of their customers

Firstly, most internet service providers do not have control over what information is published through their systems. Rather, they provide a service that lets others publish information. Blaming the internet service provider makes about as much sense as making Telecom liable for a death threat made over the telephone.

A much better target of the law would be the people who wrote and/or published the information – but existing law already covers this adequately. We recently saw that Cameron Slater was convicted for breaching name suppression orders on his Whaleoil blog.

By adding this responsibility to internet service providers, we are making them liable for the actions of others, something that we believe is abhorrent to natural justice. We are also increasing costs and undermining an industry that is already vulnerable to cheaper providers overseas.

Takedowns

Secondly, there is the very serious problem that internet providers have no way of knowing what information is suppressed.

Consider the case where someone contacts Orcon and says “One of your customers is hosting a website that publishes information in breach of a suppression order.” What should Orcon do?

There is no way that they can prove one way or the other whether the information is in breach. There is no central registry or place to call to find out. The published information may even be quite cryptic – just a name, or possibly a riddle. And if they get it wrong – they’re liable.

The only sensible reaction for Orcon is to take down the website as soon as they’re notified it is in breach – in effect, an accusation is enough to get a website taken offline.

This is contrary to provisions in the Bill of Rights Act including the right to be secure against unreasonable seizure and the right to be able to defend yourself when accused of a crime.

More importantly, it also undermines the right to freedom of speech. The hair-trigger nature of the requirements on ISPs means that anyone can get a locally hosted website taken down – even if they’re lying and just doing it because they don’t like what the website says. We note that there is no penalty for making a false accusation.

You may want to think about how desirable this will be in the run-up to the election, as partisans try to get your party’s or your personal websites taken offline.

Technical Aspects of Hosting

Finally we want to talk about some of the technical aspects of hosting websites and why it’s not as simple as just saying that the ISP must “delete the material or prevent access to it”.

While the publisher or the person who runs the website can easily remove or hide a single piece of information, the people who provide the computer it’s running on or even just the internet connection do not have this sort of access.

By analogy, you could think of the internet service provider as a bank, with the individual sites counting as safe deposit boxes within the bank. While the bank can control access to the vault with the boxes, it does not hold a key to each box that would allow it to open it and remove a single item. Only the person who rents the box has that key.

The same holds true for ISPs and websites. While ISPs provide the basic services such as power and internet connection, they have no access into the working of the site. If an ISP receives notification that a forum post on a website run by one of their customers breaches a suppression order, they won’t have the necessary login and password to get into the site’s control panel to delete the offending post. Their only option will be to turn off the entire site.

Even worse than that, ISPs often host entire servers for their customers. Each server may contain multiple websites – and the ISP again has no access to their inner workings. If told that there is material on one of those websites that breaches suppression, their only option to avoid liability will be to block access to the server, thereby taking down all of the sites hosted on it.

I suggest that this sort of disproportionate response is not what is intended by the Bill, but it is the inevitable consequence of how the Bill is written, combined with how internet services are actually provided.

Conclusion

In conclusion, we believe that section 216 is an offence to freedom of speech and the right to due process. It imposes an unfair burden on ISPs, forcing them to act as judge, jury and quick-draw executioner. It will lead to false accusations and improper censorship. It will lead to disproportionate responses as entire websites or groups of websites are taken down to remove one piece of information.

And, to what purpose? We already know that the Internet is not beyond the law. We have seen convictions of people for wilfully breaching suppression orders on a website.

We see no benefit in section 216 and no way that it can be made to work. We recommend that it be dropped from the Bill in its entirety.