AI is reported in extreme terms: it's revolutionising our roads, our workplaces and our homes - or it's stealing our jobs and will eradicate humanity. But what about operating in a war zone?
What is the rightful role of engineers in such circumstances? Can programmers whose code finds its way into systems used in war, or that rob citizens of their freedoms, justifiably say "no" - or absolve themselves by claiming they were simply following orders? Do they have the power, or an obligation, to address the moral effects of what they build?
For 3,000 Googlers, that proved worth doing in 2018, when they learned of their employer's involvement in the US Defense Department's Project Maven - the use of one of its TensorFlow APIs to help track and identify objects in videos taken by drones. Staffers told CEO Sundar Pichai in an open letter that Google "should not be in the business of war".
It worked, and Google opted out of Maven. But with drones a firmly established battlefield tool, and with some states actively seeking to field AI in combat, others will take Google's place.
The protesters broke new ground. For decades, firms have supplied the military, intelligence services and unsavoury foreign powers without a shrug from the coders building the gear, but 2018 saw a change as workers rose up:...