At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
According to Uber, the developmental self-driving system relies on an attentive operator to intervene if the system fails to perform appropriately during testing. In addition, the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.
“If we send someone a catalog and say, ‘Congratulations on your first child!’ and they’ve never told us they’re pregnant, that’s going to make some people uncomfortable,” Pole told me. “We are very conservative about compliance with all privacy laws. But even if you’re following the law, you can do things where people get queasy.”
Jurassic Park
John Hammond: I don't think you're giving us our due credit. Our scientists have done things which nobody's ever done before...
Dr. Ian Malcolm: Yeah, yeah, but your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should.
We study fifteen months of human mobility data for one and a half million individuals and find that human mobility traces are highly unique. In fact, in a dataset where the location of an individual is specified hourly, and with a spatial resolution equal to that given by the carrier's antennas, four spatio-temporal points are enough to uniquely identify 95% of the individuals.
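To make the finding above concrete, here is a minimal, illustrative sketch (not the paper's actual pipeline) of the re-identification test it describes: sample a handful of (hour, antenna) points from one person's trace and count how many traces in the dataset are consistent with all of them. The trace format and toy data are assumptions for illustration only.

```python
import random

# Each trace is a set of (hour, antenna_id) points for one individual (toy data).
traces = {
    "alice": {(8, "A12"), (9, "A12"), (13, "B03"), (18, "C44"), (22, "A12")},
    "bob":   {(8, "A12"), (9, "B03"), (13, "B03"), (18, "C44"), (22, "D07")},
    "carol": {(8, "E01"), (9, "E01"), (13, "B03"), (18, "F19"), (22, "E01")},
}

def matching_traces(points, traces):
    """Return the individuals whose trace contains every given (hour, antenna) point."""
    return [person for person, trace in traces.items() if points <= trace]

# Sample 4 points from one trace and check whether they single that person out.
sample = set(random.sample(sorted(traces["alice"]), 4))
print(sample, "->", matching_traces(sample, traces))
```

In the study's real carrier data, four such points were enough to single out roughly 95% of 1.5 million individuals; the toy example only shows the mechanics of the uniqueness test.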
"We think this data is more available than people think. When you think about, for instance wi-fi or any application you start on your phone, we call up the same kind of mobility data."
😭
“Adults are so busy and focused on so much other than ethical issues that we don’t often stop to think coherently about what our moral principles really are. Or what we think of our own moral character. We just assume we’re good people and let it go at that."
References:

https://www.brown.edu/academics/science-and-technology-studies/framework-making-ethical-decisions
https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#434722ae6668
https://www.engadget.com/2018/06/27/google-duplex-ai-phone-call/
https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?pagewanted=1&_r=1&hp
http://www.bbc.co.uk/news/science-environment-21923360
http://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf
http://bigthink.com/design-for-good/4-tips-to-help-you-make-better-more-ethical-decisions
https://www.bostonglobe.com/ideas/2018/03/22/computer-science-faces-ethics-crisis-the-cambridge-analytica-scandal-proves/IzaXxl2BsYBtwM4nxezgcP/story.html
https://www.nature.com/articles/srep01376