
Israel's use of AI in Gaza raises concerns about civilian casualties and the future of warfare

• A recent report by +972 Magazine alleges that the Israeli army used an artificial intelligence (AI) system called Lavender to generate lists of tens of thousands of human targets for potential airstrikes in Gaza, leading to a large number of civilian casualties.

• The report, based on interviews with six unnamed sources in Israeli intelligence, claims that Lavender was used in conjunction with other AI systems to target and assassinate suspected militants, often in their own homes, without proper human oversight.

• The Israel Defense Forces (IDF) denies many of the claims, stating that it does not use an AI system to identify terrorist operatives and that Lavender is merely a database for cross-referencing intelligence sources.

• However, previous reports and statements suggest that Israel has been actively developing and using AI systems for military purposes, including a "first AI war" against Hamas in 2021 and the use of another AI system called Habsora to identify potential militant buildings and facilities for bombing.

• The report also highlights concerns about the accuracy and reliability of these AI systems: one intelligence officer claimed that Lavender made errors in approximately 10% of cases, yet flagged targets were bombed in their homes without hesitation, resulting in civilian casualties.

• The increasing use of military AI raises ethical, moral, and legal concerns, as there are currently no clear, universally accepted, or legally binding rules governing the development and deployment of such systems.
