If you go to Tesla’s website to order a car today, you’ll find a new upcoming feature listed under the brand’s Full Self-Driving Capability (a $7,000 add-on that promises autonomous capability in the future, but not now). Is it full self-driving? No. But the new addition, dubbed “Recognize and respond to traffic lights and stop signs,” is a step toward FSD’s promised autonomous capabilities, even if it hasn’t yet been officially rolled out.
What does the still-in-beta feature do? Just what you’d expect, given its name: It allows Teslas to “see” traffic signals and stop signs—essentially, detect intersections—and respond accordingly. This was put on display on Out of Spec Motoring’s YouTube channel, which shared a video of a Tesla Model 3 automatically stopping for a red light with Autopilot engaged; the channel also claims the car halted at a stop sign. Arresting, for sure, but Tesla hadn’t—and still hasn’t—announced anything about this new FSD component rolling out to the fleet, which indicates it is only available to a small group of Tesla owners under a pilot program.
Tesla hacker @greentheonly on Twitter also found a user’s manual for the feature buried in Firmware 2020.12.1 for the Model 3 and Model Y (and of course it’s a “beta” feature). The wording is carefully crafted, but here’s basically how the feature works:
The system uses forward-facing cameras, GPS, and map data to detect traffic lights and stop signs. As the car approaches an intersection with Autopilot engaged, you’re notified of an upcoming traffic light, and regardless of whether the light is green, yellow, or red, the car will slow down to a complete stop. A red line, displayed on the touchscreen, shows where the car has determined it has to stop (the car only stops if it’s showing a red line). If the approaching light is green and you deem it safe to keep going, you can override the deceleration by pressing down on the accelerator pedal or the gear lever. If the light turns green after a complete stop, you do the same thing to give the car permission to proceed.
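To make that behavior concrete, here is a minimal sketch of the decision logic the leaked manual implies: plan a stop at any detected stop line, regardless of light color, and proceed only when the driver explicitly confirms. This is purely illustrative and not Tesla’s software; every name and structure here is our own assumption.

```python
# Illustrative sketch only -- not Tesla's code. It models the behavior the
# manual describes: default to stopping at every detected control point and
# proceed only on explicit driver confirmation. All names are hypothetical.

from dataclasses import dataclass
from enum import Enum, auto


class LightState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    UNKNOWN = auto()          # e.g., a stop sign or an unreadable signal


class Action(Enum):
    SLOW_TO_STOP = auto()     # decelerate toward the displayed red stop line
    HOLD = auto()             # remain stopped at the line
    PROCEED = auto()          # driver confirmed it is safe to continue


@dataclass
class Intersection:
    stop_line_detected: bool  # the "red line" shown on the touchscreen
    light: LightState         # detected and displayed, but per the manual it
                              # does not change the default decision to stop


def plan(intersection: Intersection, driver_confirmed: bool, stopped: bool) -> Action:
    """Default to stopping at any detected stop line; only driver input overrides."""
    if not intersection.stop_line_detected:
        # No stop line resolved, so the system makes no stopping decision here.
        return Action.PROCEED
    if driver_confirmed:
        # An accelerator tap or stalk press grants permission to continue,
        # whether the car is still rolling or already stopped at the line.
        return Action.PROCEED
    return Action.HOLD if stopped else Action.SLOW_TO_STOP


# Example: approaching a green light with no driver confirmation -- the car
# still plans a stop, which matches the cautious behavior described above.
print(plan(Intersection(stop_line_detected=True, light=LightState.GREEN),
           driver_confirmed=False, stopped=False))
```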
For now, the feature can only determine where and when to stop; it still depends on the driver for all other decisions. Also, we should point out, it is overly cautious—meaning the only way a Tesla might run a red light is through driver error, with Autopilot disengaged, or if the car doesn’t know it’s entering an intersection in the first place. As is Tesla’s common refrain, the use of Autopilot and other driver aids is fully the driver’s responsibility.
While in its current form the intersection detection may seem a little wonky, it is designed for something more useful in the near term: machine learning. In order to train the neural net, Tesla needs a massive number of real-world examples, plus feedback about how humans responded to them. This explains why Tesla is having drivers make the decision at green lights: The company will harvest this real-world feedback to train the FSD neural net. (Have you ever been asked to pick the images with traffic lights or buses from a grid of photos before logging into a website? It’s the same principle; you’re providing data to train an image-recognition system, and you might not even know it.) This isn’t the first time Tesla has leaned on real-world testing using customers. When Navigate on Autopilot first appeared, the system would suggest a lane change but required the driver’s confirmation. Months later, the system was updated to allow automatic lane changes without that confirmation.
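The underlying idea is simple enough to sketch: each driver confirmation at an intersection becomes a labeled example pairing what the car perceived with what the human decided. The toy snippet below illustrates that principle only; it is not Tesla’s data pipeline, and the event fields and function names are assumptions.

```python
# Toy illustration of the human-in-the-loop data-harvesting idea described
# above, not Tesla's pipeline. Each driver confirmation (or hesitation) at a
# green light is logged as a labeled example for later training.

import json
import time
from dataclasses import dataclass, asdict
from typing import List


@dataclass
class IntersectionEvent:
    timestamp: float
    light_color: str          # what the vision stack believed it saw
    vehicle_speed_mps: float
    driver_confirmed: bool    # did the human say "proceed"?


def log_event(buffer: List[dict], light_color: str, speed: float, confirmed: bool) -> None:
    """Append one human-labeled example to an upload buffer."""
    event = IntersectionEvent(time.time(), light_color, speed, confirmed)
    buffer.append(asdict(event))


if __name__ == "__main__":
    examples: List[dict] = []
    log_event(examples, "green", 12.5, confirmed=True)   # driver agreed with "green"
    log_event(examples, "green", 10.0, confirmed=False)  # driver hesitated -- also useful signal
    print(json.dumps(examples, indent=2))
```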
Dealing with intersections is a tougher problem than highway lane-changing, and given that “Recognize and respond to traffic lights and stop signs” has only just appeared in the small fleet of the Early Access program, it could be a while before there’s a mass rollout. We’ve also yet to see what foreseeable (or unforeseeable) consequences the programming could have. Teslas stopping or slowing at green lights because their drivers aren’t paying attention (at least, not enough to override the new system) could theoretically result in rear-end collisions, since inattentive drivers coming up behind might not anticipate a car slowing or stopping at a green light. Such an accident wouldn’t technically be the Tesla’s fault (it would be that of the inattentive following driver), but we mention the possibility because a Tesla running this setup would be behaving counter to surrounding traffic and the traffic signal ahead.
The feature also encourages Autopilot use on surface streets—a new realm with variables not present on freeways, such as bicyclists and pedestrians. As always, the driver is responsible for the intelligent use of Autopilot, so let’s hope we don’t see the kind of Autopilot misuse we’ve seen in the past crop up in busier in-town settings. Just remember, the new programming is a stepping stone to that future Full Self-Driving Capability.