

AI and weapons might not be the smartest move

China Daily | Updated: 2018-06-12 07:24
The Google logo is seen at the Young Entrepreneurs fair in Paris, France, February 7, 2018. [Photo/VCG]

ON THURSDAY, Google released a set of principles to guide its artificial intelligence work, stating that it will not support the use of AI in weapons systems. Thepaper.cn commented on Monday:

Google released the principles under pressure from its staff and a public outcry, after it was reported that the company had signed a contract with the US military to provide its TensorFlow machine learning API.

That sparked criticism both within and outside Google, with many people worried that the technology could be used to threaten human lives. Reports say about 4,000 employees signed a letter opposing the contract.

Google's release of its AI principles may have pacified people. However, it raises the question: How do we prevent AI from posing a threat to humans? Science fiction writers have long asked this question in their works, and many have expressed worries about robots killing people.

In 2012, when the US military was reported to be using intelligent unmanned aerial vehicles on the battlefield, the worries deepened. Some pointed out that once a UAV possesses intelligence, a machine has the power to decide whether to kill a human.

Many have tried to solve this problem. Isaac Asimov, in his collection I, Robot, set out the Three Laws of Robotics so that intelligent machines would not harm humans.

However, such principles rely on the AI itself to comply, not on humans. A more effective approach is to bar robots and AI from controlling weapons, so that they never get the chance to kill humans.

With advances in deep learning algorithms, AI will become increasingly independent of humans. If AI systems are given control over weapons, the day when they decide to kill humans might come. It is better to prevent that from the very beginning by strictly prohibiting AI from controlling weapons.
