Technology regulation in New Zealand needs an overhaul


Many of you might have watched the Terminator movies when you were a kid. For those deprived of what was essential viewing for my generation, they were about the remnants of humanity fighting intelligent machines, created by the artificial intelligence Skynet, that posed a threat to the human race. These movies were science fiction at its finest. But 30 years after the first one, killer robots are not as far-fetched as we once thought.

It is not just killer robots – more on that later – that have raised concerns, but also the misuse around airports of drones, which have many practical military and civilian uses, and the rise of the sexual robot. A mixture of security, ethical and safety issues has arisen at a speed that has caught New Zealand politicians flat-footed.

New Zealand politicians have been slow to catch on to the growing threat posed, for example, by the use of drones and lasers around airports. Not a month goes by without drones and/or lasers being implicated in a potentially dangerous act that could have brought down an aircraft. A few weeks ago, drones held up or forced the diversion of aircraft at Auckland Airport for over an hour. Other incidents have included people shining lasers at aircraft around Christchurch Airport.

Whilst progress is being made in tackling interference with aircraft by people wielding lasers, the same cannot be said for drones. In the case of lasers, criminal prosecutions have been brought against several people, sending a message that this is criminal activity and that offenders can be traced.

Drones pose a bigger risk. They can hover for long periods, move unpredictably so that an aircraft pilot has no chance to react in time, and have enough physical mass to cause substantial damage to a plane. Coupled with the restrictions placed on aircraft flight paths around airports, the potential for a major civil aviation incident is very real.

It is time to ask questions about how appropriate sexual robots are. These are predominantly female-gendered robots built to imitate the performance of sexual favours. Robots have no concept of ethics, so, given the justified alarm over sexual violence, how appropriate is it for a person to act out their fantasies on a robotic being that cannot say no or physically reject inappropriate conduct? Without appropriate checks on what sort of functions a robot can and cannot perform, is technology inadvertently lending itself to some of the darkest and most dangerous forms of control over a human being?

But the most dangerous robotic menace is the potential murder drone or killer robot that might open fire or otherwise use lethal force against a human being. The artificial intelligence race means that robots with a degree of humanoid intelligence already exist. This is not just a concern of mine, but a concern of human rights organisations such as Amnesty International and Human Rights Watch. Numerous countries are already calling for a ban on such technology, pointing to the risk that rogue states such as – but not limited to – North Korea might get hold of it and would be almost certain to use it against rivals.

Overhauling regulation does not necessarily mean bringing in a raft of new laws, although that will certainly be necessary for some types of technology. It might well be that existing laws are fine, but just need updating. In the case of drones, for example, new regulations will be necessary, including licensing, fines and compliance with Civil Aviation Authority operating rules.
