
Submission on the Harmful Digital Communications Bill

1. About

This submission to the Justice and Electoral Committee on the Harmful Digital Communications Bill (the Bill) is made by the Internet Party. At the time of writing, the Internet Party is in the process of being established and registered with the Electoral Commission. A representative of the Internet Party wishes to appear before the Committee to speak to this submission.

2. Non-legislative Options

Non-legislative options to address the majority of the harms caused by digital communications are preferable to the costs and unintended consequences of the Bill. We favour developing and introducing non-legislative options as a first step, and legislating two to three years after such options have been introduced to address specific, identified areas where non-legislative options have been proven to be inadequate.

Neither the Bill nor non-legislative options will totally eliminate the harms caused by digital communications. Non-legislative options can, however, effectively address 90% of the harms caused.[1] As an example, Australia has introduced the Cooperative Arrangements for Complaints Handling on Social Networking Sites (the Protocol),[2] which was developed through close cooperation with the industry and has been agreed to by Facebook, Google (YouTube), Yahoo! and Microsoft.[3] It is in the self-interest of the major social networks to avoid the costs and unintended consequences imposed by legislation.

The Government should proceed non-legislatively by granting the Approved Agency formal recognition and resources. This should be sufficient to empower the Approved Agency to create a voluntary, collaborative framework suitable for New Zealand. Such a framework will also effectively tackle most issues arising from the limited jurisdiction of New Zealand law in relation to the major overseas social networks. Legislation should be developed only after such non-legislative options have been exhausted, to address specific, identified shortcomings.

3. Support for Some Provisions

We support the Bill's provisions creating a new offence of incitement to commit suicide, even in situations where a person does not attempt to take their own life, as well as the amendments to the Harassment, Privacy and Human Rights Acts to ensure they are up to date for digital communications.

4. Safe Harbour Provisions

Safe harbour provisions intended to shield content hosts from liability for content posted by third parties, subject to their taking certain actions, are welcome. However, we believe that such provisions will inevitably be abused by frivolous or vexatious complaints, and by complaints that do not meet the serious emotional distress threshold of the Bill, as hosts face great risks and no upside in questioning, delaying, or challenging notices.
We recommend that only a harmed complainant be able to lodge a notice of complaint with the content host, i.e. no complaint should be able to be lodged on behalf of the allegedly harmed person other than by the Approved Agency under clause 20(4). Accordingly, clause 20(3) should be amended to require the notice to include such a certification and to state that the named person has been personally harmed. To give adequate weight to a notice under clause 20(3), the complainant should certify that its contents are true to the best of his/her knowledge.

[1] NetSafe figures for complaints involving people who know each other utilising digital communications on the major social networks.
[2] Available at http://www.communications.gov.au/__data/assets/pdf_file/0004/160942/Cooperative_Arrangement_for_Complaints_Handling_on_Social_Networking_Sites.pdf
[3] Individual self-declarations for complaints handling are available at http://www.communications.gov.au/funding_and_programs/cyber_safety

We further recommend a review mechanism for content that a complainant seeks to have removed by the content host. A "notice and notice" regime is not favoured as it leads to additional costs for the content host. The recommended process is:

(a) On receipt of a valid notice under clause 20(3), the content host removes the content as soon as practicable, replacing it with the notice "Content removed and subject to review". Removing content in this manner minimises continuing harm, if any.

(b) The content host is required to forward the notice, a copy of the content taken down, and any identifying information about the person who posted it to the Approved Agency for review as soon as practicable after the content is removed. The Approved Agency should be empowered to obtain personally identifying information from the Internet Service Provider of the content author if the content host does not have this information and can provide only the IP address and timestamp of the content author.

(c) The Approved Agency notifies the content author and gives that person 48 hours to challenge or otherwise explain the content.

(d) The Approved Agency takes into account any response from the content author and decides whether or not the content should be removed under the provisions of the Bill. The content host is to be notified within 72 hours of the Agency receiving the referral. In accordance with the Approved Agency's decision, the content host either permanently removes the content or restores it.

(e) The content host continues to benefit from the safe harbour provisions as long as it undertakes the steps above.
(f) The decision of the Approved Agency can be challenged in a District Court.

This process imposes a low cost on the content host while not requiring it to make a judgement about the content. At the same time, it provides for automatic review by an independent body, minimising the period of time for which content is removed as a result of frivolous or vexatious complaints.

5. Limit Approved Agency to Young Adults

The Bill defines harm as serious emotional distress. The causes, nature, and impact of emotional distress, and the ability to cope with it, vary between young adults and adults. Communications by adults often require more complex and more nuanced review, something that is perhaps better suited to the District Court. Both of these factors justify varying the civil remedial measures available to young adults and adults.

We recommend that the Approved Agency deal only with complaints from young adults, defined as individuals aged up to 21 years. Complaints from older individuals would lie only with the District Court, which is still expected to provide quick remedies under the provisions of the Bill. Adults will still be able to take steps under current laws and, in addition, bring criminal complaints (where there is an intent to cause harm and actual harm has arisen) against the worst harmful digital communications under clause 19, which specifically includes harm from intimate visual recordings. Adults will also still gain increased protection from the tightening of the domestic affairs exemption under section 56 of the Privacy Act.

6. Delay Some Provisions

The provisions for bringing proceedings before a District Court (clauses 10 to 18 and 22) and the new criminal offence provided by clause 19, in our opinion, deliver the least benefit towards the Bill's objectives while accounting for the bulk of its unintended consequences in inhibiting free speech, especially amongst adults. Accordingly, in the event that our recommendation for non-legislative options is not accepted and clauses 10 to 19 and 22 are enacted, we recommend that these clauses not come into force until three years after the rest of the Bill comes into force, and that a review of the operations of the Approved Agency be conducted to verify the continuing need for these provisions. For the same reasons, and with the same comments about a three-year delay, we oppose the amendments to the Crimes Act by clauses 23 and 24 at present.

7. Education and Publicity are Critical

Cyber bullying is a sub-set of bullying. Enacting legislation to tackle bullying is at best one component of trying to change social norms. National Administration Guideline 5 requires Boards of Trustees of schools to provide a safe physical and emotional environment.
While the Government intends to provide additional support to Boards of Trustees in addressing cyber bullying, we recommend that the Committee call for the Government to direct significantly more effort and resources to this area. Further, the Government needs to use its agencies and partners in a more collaborative manner to educate both young adults and adults about harmful digital communications.

We would also like to draw the Committee's attention to the proposed amendment to the Privacy Act by the Bill. Currently, most New Zealanders assume that anything in the public domain, including personal information, can be propagated further without thought or consequence. The amendment seeks to stop unreasonable or unfair propagation of personal information that is already publicly available, contrary to widespread societal norms in this regard. Clause 34 of the Bill sets the threshold at "unfair or unreasonable", which is substantially different from the threshold of harm (serious emotional distress) in the Bill generally. Accordingly, we recommend the Committee call for the Government to publicise this change and its implications widely, as otherwise many people will be unaware of, and caught out by, the change to the Privacy Act.
