SEE Seeing Machines Limited

4.39
0.185 (4.40%)
26 Apr 2024 - Closed
Delayed by 15 minutes
Share Name: Seeing Machines Limited
Share Symbol: LSE:SEE
Market: London
Type: Ordinary Share
Share ISIN: AU0000XINAJ0
Share Description: ORD NPV (DI)

Price Change: 0.185
% Change: 4.40%
Share Price: 4.39
Bid Price: 4.30
Offer Price: 4.34
High Price: 4.505
Low Price: 4.185
Open Price: 4.30
Shares Traded: 7,555,243
Last Trade: 16:35:25

Industry Sector: Computer Related Svcs, Nec
Turnover: 57.77M
Profit: -15.55M
EPS - Basic: -0.0037
PE Ratio: -11.62
Market Cap: 178.71M
Seeing Machines Limited is listed in the Computer Related Svcs sector of the London Stock Exchange with ticker SEE. The last closing price for Seeing Machines was 4.21p. Over the last year, Seeing Machines shares have traded in a share price range of 3.985p to 6.15p.

Seeing Machines currently has 4,156,019,000 shares in issue. The market capitalisation of Seeing Machines is £178.71 million. Seeing Machines has a price to earnings ratio (PE ratio) of -11.62.
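As a quick check on those figures: market capitalisation is shares in issue multiplied by the share price, and the PE ratio is the share price divided by earnings per share. A minimal sketch in Python, assuming (an inference, not stated on the page) that both figures are computed from a 4.30p basis price, since that reproduces the £178.71 million capitalisation and the -11.62 PE shown above:

```python
# Quote-page arithmetic, using figures from the table above.
# The 4.30p basis price is an assumption: it matches the quoted bid price
# and reproduces both the market cap and the PE ratio shown.

shares_in_issue = 4_156_019_000
price_pence = 4.30        # assumed basis price, in pence per share
eps_pence = -0.37         # "EPS - Basic" of -0.0037 GBP, expressed in pence

market_cap_gbp = shares_in_issue * price_pence / 100   # pence -> pounds
pe_ratio = price_pence / eps_pence

print(f"Market cap: £{market_cap_gbp / 1e6:.2f}M")     # -> £178.71M
print(f"PE ratio:   {pe_ratio:.2f}")                   # -> -11.62
```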

Seeing Machines Share Discussion Threads

Showing 16901 to 16919 of 21850 messages
05/9/2019
18:14
It is difficult when the share price of a company we have invested in fails to respond to good news, but we must remember that we are still viewed as a highly speculative blue-sky company with a recent habit of disappointing the markets: the inadequately explained Fleet disaster last year, break-even announced to be much later than expected, regular changes at Board level, a bigger and more deeply discounted placing than expected, and anticipated contract wins (e.g. rail and aviation) yet to be announced. That is quite apart from being HQ'd in Canberra, where most Aussies would rather not live.
So it should not be such a surprise that our share price continues to languish. However, the next few weeks and months could well prove truly transformational for us if we can win the remainder of the RFQs plus Toyota, the aviation deal referred to by PM in his July interview, and one or two licensing deals securing our financial position, along with, of course, the continued rehabilitation of Fleet, which will hopefully now be growing ahead of expectations.
It is often darkest just before dawn, and any sign of our share price slipping back towards 3p will have me taking my stake to a higher level again.
Hopefully my optimism will be rewarded.

base7
05/9/2019
16:09
Hmm... not what I expected at all
rjcdc
05/9/2019
08:31
Feds scold Tesla for slow response on driver monitoring
In 2017, NTSB called steering wheel torque a "poor surrogate" for driver attention.
TIMOTHY B. LEE - 9/4/2019, 10:50 PM

The National Transportation Safety Board, a federal agency tasked with investigating transportation crashes, published a preliminary report Tuesday about a January 2018 crash in Culver City, California. For the most part, the report confirmed what we already knew about the incident: a Tesla Model S with Autopilot engaged crashed into a fire truck at 31 miles per hour. Thankfully, no one was seriously injured.

But near the end of its report, NTSB called Tesla out for failing to respond to a 2017 recommendation to improve its driver monitoring system.

Tesla's cars measure driver engagement by using a torque sensor to check whether the driver's hands are on the steering wheel. The NTSB criticized this approach in its 2017 report. The Culver City crash illustrates the point: before the crash, Autopilot issued four visual alerts and one audio alert over the course of a 13-minute trip. Yet the driver admitted he didn't see the firetruck.

Tesla says it has made significant improvements to its system since then.

"Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they're activated," a Tesla spokeswoman said in an email statement to Ars Technica. She touted Tesla's overall safety record and said that "Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored."

Yet the NTSB says it's still waiting for Tesla to explain its current technology and how it will improve over time. According to the agency, other carmakers have submitted responses describing their plans in this area. But not Tesla.

The NTSB's job is to investigate crashes, not write regulations. So snubbing the NTSB may not have any immediate consequences for Tesla. Still, the NTSB's pointed criticism will increase pressure on Tesla to rethink its approach to driver monitoring. The agency notes that it is still investigating two recent fatal crashes involving Autopilot—one in California in March 2018 and another in Florida in March 2019.

A steering wheel torque sensor may not be enough
After Tesla's first fatal crash involving Autopilot in 2016, the NTSB recommended that carmakers improve the way Autopilot monitors drivers to make sure they pay attention to the road. The NTSB suggested that Tesla's torque-based approach wasn't up to the job.

"Because driving is an inherently visual task, a driver might touch the steering wheel without assessing the roadway, traffic conditions, or vehicle performance," the agency wrote in 2017. As a result, "monitoring steering wheel torque provides a poor surrogate means of determining a driver's degree of engagement with the driving task."
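To make the NTSB's "poor surrogate" point concrete, here is a purely illustrative sketch of a torque-based hands-on check (the threshold and warning interval are invented for illustration, not Tesla's actual values). Note that it counts the driver as engaged whenever wheel torque is detected, regardless of where the driver is looking:

```python
# Illustrative torque-based "driver engagement" check.
# Threshold and warning interval are hypothetical, not Tesla's real values.

TORQUE_THRESHOLD_NM = 0.3    # assumed minimum torque that counts as hands-on
HANDS_OFF_WARN_S = 60.0      # assumed time without torque before a warning

def torque_check(wheel_torque_nm: float, hands_off_s: float, dt: float):
    """One control-loop tick: returns (action, updated hands-off timer)."""
    if abs(wheel_torque_nm) >= TORQUE_THRESHOLD_NM:
        # Torque detected: driver counted as engaged, even if looking away.
        return "ok", 0.0
    hands_off_s += dt
    if hands_off_s >= HANDS_OFF_WARN_S:
        return "warn_driver", hands_off_s
    return "ok", hands_off_s
```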

In its 2017 report about the deadly 2016 Autopilot crash, the NTSB recommended that Tesla and other automakers with similar driver-assistance systems "develop applications to more effectively sense the driver's level of engagement."

The NTSB didn't endorse any particular technology for doing this, but one approach that has been gaining popularity in the industry is to use a camera to monitor the driver's face and determine where the driver is looking. If the driver's eyes are off the road for more than a few seconds, the car warns the driver and eventually brings the car to a stop. Cadillac's Super Cruise, for example, takes this approach.
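By contrast with the torque sketch above, a camera-based system keys directly off gaze. Again a hedged sketch: the thresholds and the escalation to a controlled stop are assumptions illustrating the approach described, not the published design of Super Cruise or any other product:

```python
# Illustrative gaze-based driver-monitoring escalation.
# Thresholds and actions are hypothetical, chosen only to show the idea.

GAZE_WARN_S = 2.0    # assumed: warn once eyes are off the road this long
GAZE_STOP_S = 8.0    # assumed: begin a controlled stop after this long

def gaze_check(eyes_on_road: bool, off_road_s: float, dt: float):
    """One control-loop tick: returns (action, updated eyes-off timer)."""
    if eyes_on_road:
        return "ok", 0.0                 # attention restored: reset the timer
    off_road_s += dt
    if off_road_s >= GAZE_STOP_S:
        return "controlled_stop", off_road_s   # driver unresponsive
    if off_road_s >= GAZE_WARN_S:
        return "warn_driver", off_road_s
    return "ok", off_road_s
```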

In this week's report, NTSB says it addressed its 2017 recommendation to several carmakers: Tesla, Volkswagen, BMW, Nissan, Mercedes-Benz, and Volvo.

"All manufacturers except Tesla have responded to the NTSB explaining their current systems and their efforts to reduce misuse and keep drivers engaged, including consideration for improving driver monitoring techniques," the agency writes. In contrast, the agency says it's still waiting for a response from Tesla.

mirabeau
05/9/2019
08:14
09.04.2019 06:29 PM

Tesla has reduced the time before drivers are warned to reapply pressure to the steering wheel, but the changes may not be adequate for investigators.


The design of Tesla’s Autopilot feature contributed to a January 2018 accident in which a Model S sedan smashed into the back of a fire truck in Southern California, according to federal safety investigators. It is the second time the National Transportation Safety Board has found Tesla partially responsible for a crash involving the semiautomated feature. The federal board says it’s also investigating two other Autopilot-involved crashes.

No one was hurt in the 2018 crash, but investigators found that the driver had flipped on Autopilot about 14 minutes before the crash and had not actively steered for the final 13 minutes. Investigators said the driver’s inattention and overreliance on Autopilot were probable causes of the crash. During those final 14 minutes, the car warned the driver to apply pressure to the steering wheel four times, but he did not apply pressure in the roughly four minutes before the crash, investigators found.

Investigators said the driver’s use of Autopilot was “in ways inconsistent” with Tesla’s guidance. The driver said he learned how to use Autopilot from a Tesla salesperson but did not read the owner’s manual, which tells drivers exactly when and where they should use Autopilot.


The incident emphasizes what industry watchdogs and even Tesla itself have said before: Autopilot isn’t a self-driving technology. It requires drivers’ attention, even when the road ahead looks like smooth sailing.

But investigators also seem to believe that Tesla isn’t doing enough to make Autopilot safe. In its report, the NTSB highlighted a recommendation following another Autopilot-involved crash that killed a Florida driver in 2016. The panel asked automakers to “develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking” when using “automated vehicle control systems.” Tesla has changed how Autopilot works, requiring drivers to put pressure on the wheel more frequently while the feature is engaged. But the NTSB seems to believe it’s not enough.

“Fool me once, shame on you; fool me twice, shame on me. Fool me four, five, or six times now—that’s too much,” says David Friedman, former acting head of the National Highway Traffic Safety Administration and now director of advocacy at Consumer Reports. “If Tesla doesn’t fix Autopilot, then [the federal government] should do it for them.” (The NTSB can only recommend safety improvements; the NHTSA can enact regulations.)

Tesla said in a statement that “Tesla drivers have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance. While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer, and more effective across every hardware platform we’ve deployed. Since this incident occurred, we have made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”

The vehicle in the 2018 crash, in Culver City, California, was a 2014 model. Tesla has since revamped the hardware—the front-facing cameras and radar, the ultrasonic sensors—in its vehicles. (CEO Elon Musk has famously said that today’s Teslas have all the hardware they need to drive themselves. The electric automaker is still working on the software part.)

Investigators examined the driver’s cell phone after the crash and found that he was neither texting nor on a phone call before the incident. But the report warns that the NTSB can’t tell whether he was playing with an app on his phone. (He told investigators he wasn’t.) A witness who had been driving next to the car before it collided with the fire truck said the driver appeared to be looking down at something in his left hand.

Then there was the bagel and coffee. The driver said he had food with him in the car and believed it was beside him during the crash, but because the coffee spilled and the bagel was smashed, he couldn't be sure it hadn't been in his hand.

The incident also highlights what many have criticized about Tesla’s approach to Autopilot, and about some automakers’ semiautomated strategies, which rely on humans to monitor advanced driving features. Since at least World War II, researchers have known that humans are garbage at monitoring near-perfect technology and cannot be trusted to react when something goes wrong. The British Royal Air Force found that people monitoring radar sometimes missed the blips on their screens that indicated German subs; the self-driving-vehicle developer Waymo reportedly discovered that drivers charged with monitoring its tech from behind the wheel sometimes fell asleep.

“Humans are really bad at watching paint dry, and that’s what you’re asking them to do if the car can do a lot of the functions itself,” says Friedman.

General Motors, which calls its semiautomated feature Super Cruise, takes a different approach. Rather than relying on steering wheel torque, like Tesla, the Detroit carmaker has installed cameras inside the vehicle to monitor drivers’ eyes and ensure they’re watching the road. Tesla executives reportedly considered and then rejected that approach to driver monitoring.

Musk has acknowledged that finding the balance in semiautomated tech is difficult. “When there is a serious accident it is almost always—in fact maybe always—the case that it is an experienced user, and the issue is more one of complacency,” he has said.

But Musk has also said that publicizing Tesla crashes can kill more people in the long run by encouraging drivers not to use Autopilot, which he believes enhances safety. “It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe, because people might actually turn it off and then die,” he said last year.

Musk promised that Tesla would publish quarterly reports on Autopilot safety, but thus far those reports have come in the form of a short sentence in each earnings report, with no in-depth data to back it up. This week’s NTSB finding will likely put pressure on the carmaker to defend its signature tech breakthrough. And, perhaps, not a moment too soon: Musk has said Tesla could have 1 million totally self-driving vehicles on the road by next year—though the company has yet to demonstrate the tech.

mirabeau
02/9/2019
22:18
Team 5 would be a decent start
base7
02/9/2019
22:03
No chance of a low-ball offer. Too many shareholders over 10% who will block it.
blackpudding13
02/9/2019
08:27
Ps. I think team300 is pushing it a little but no reason why team50 isn’t possible within 12 months....
rjcdc
02/9/2019
08:11
Nice little teaser to start the month off....
rjcdc
02/9/2019
07:14
Well September has started with a nice booster RNS
zero the hero
01/9/2019
22:31
Lol... I’d rather do the calculation on the other £22k of SEE shares I have...!

September is here... lots of excitement to come.

rjcdc
30/8/2019
16:16
I added a pathetic amount of 20k shares at 4.02... so my thinking is that most of the sells are actually buys.

Roll on September. I’m sticking to my 6p target by month end...

rjcdc
30/8/2019
09:23
RNS to tell us what base7 has already told us :-)

Seeing Machines Limited (AIM: SEE, "Seeing Machines" or the "Company"), the advanced computer vision technology company that designs AI-powered operator monitoring systems to improve transport safety, confirms that it will release its audited results for the year ended 30 June 2019 on Monday, 23 September 2019.

The Company also confirms that it will be hosting a Capital Markets Day in London on Wednesday, 6 November 2019. To register for this event, please email Sophie.Nicoll@SeeingMachines.com and more details will be provided in due course.

zero the hero
29/8/2019
13:46
Quarterly newsletter released to registered investors (i.e. registered via the SEE website)
Finals - 23/9
Capital Markets Day (investors can attend) - 6/11
AGM - 28/11

base7
28/8/2019
18:17
I wish... only the former. Although I would like to get to 1m shares by Xmas, that would require some positive movement before then.
rjcdc
28/8/2019
17:24
Is that 30,000 more shares or £30,000 more ?
base7
28/8/2019
16:04
Getting close to adding another 30k... September is going to be a good month
rjcdc
27/8/2019
06:28
What it’s like to work as a safety driver in a self-driving car

An Optimus Ride employee sits in the driver seat as he monitors the Optimus Ride autonomous six-seater shuttle bus as it drives through the Brooklyn Navy Yard on August 15, 2019 in New York City.

27 August 2019 • 6:00am

At a junction in downtown Detroit, a self-driving car stops beside a pedestrian waiting to cross.

Without touching the car (an electric shuttle operated by startup May Mobility), its safety driver Andrew Dykman confidently waves the man across the road.

“I have a pretty good idea of what the car is going to do at any time. So I’m letting him know that it's waiting for him. I guess it just comes from experience,” he explains.

When a car in Uber’s self-driving testing fleet struck and killed a woman in Arizona last year, images of the safety driver were beamed around the world. The woman, Rafaela Vasquez, appears to be looking down before gasping in shock and grabbing the wheel as the pedestrian, Elaine Herzberg, is hit. A police report concluded that she had been watching The Voice on her phone.

The incident put the spotlight on a mysterious and brand-new line of work. Sitting in the driver’s seat and keeping an eye on a self-driving car in training wasn't anyone's job ten years ago (apart from perhaps a handful of Google engineers). Now, there are hundreds, potentially thousands of them worldwide.

Companies send their partially-developed cars out in public to collect data and to examine how they react to real-life situations. Since none of them are yet fully autonomous, they need a human babysitter.

But what is that job like? Put simply, the role involves sitting in the driver’s seat of a semi-autonomous car and making sure it doesn’t veer off the road, hit anything, or otherwise go rogue. Sounds simple. But it can also be exhausting, lonely, and potentially very dangerous.

“We all know that it's a difficult thing to [do] monotonous driving for eight hours, even when you're doing the driving. So imagine that now, your task is to maintain complete awareness, while someone else is driving. And that is a very difficult thing for any human being,” says Eric Paul Dennis, of independent non-profit the Centre for Automotive Research.

“There is a higher cognitive load doing this than driving all the time, because you're thinking about what the car [will] do,” says Dykman, who is May Mobility’s technical programme manager.

A car which can safely deal with all situations doesn’t yet exist, and its human supervisor is still responsible for everything it does. This means the safety driver has to pay close attention to what is happening around them, while not actually doing anything for long periods of time.

Of course, piloting a cutting-edge semi-autonomous car for hours each day on public roads was always going to be a risky job. But Uber’s accident brought the risks into sharper relief.

Uber spokeswoman Sarah Abboud says the company has focused on reducing driver fatigue, telling drivers that they can come back to base at any time if they feel tired or if something is wrong with the car, and always having two drivers in the car (the driver who crashed was alone at the time).

It also keeps a closer eye on its safety drivers. Companies which detect distracted driving by looking at body position and eye gaze are increasingly selling their products to self-driving car companies. “We have a driver monitoring system, so we're actually able in real time to know if, say for example, a safety driver behind the wheel is looking down for an extended period of time or not paying attention to the road,” she says.

Public transport company Transdev, which is developing self-driving shuttles, has advertised for a self-driving vehicle operator in Phoenix, Arizona, paid $20 per hour, who can "operate a vehicle 6-8 hours a day alone, five days a week; able to sit still for long periods of time."

The company said it uses cameras to detect driver attention levels and encourages them to ask for someone else to take over if they are tired, without fear of repercussion.

"It is important to note that fatigue and distracted driving are not new problems. As a transport operator, Transdev has always needed to do all we can to make sure our drivers are focused and alert," it said in a statement.

As members of the public begin to take rides in self-driving cars, safety drivers are also one of the few human faces of this new technology, so they’re taught how to answer curious questions from the public, something almost all drivers have dealt with at some point. "I've seen people in Audi R8s drive by us and look at our cars, and I'm like - 'you're looking at me?'" says Dykman.

According to Uber, people with military backgrounds make particularly great safety drivers. Veterans, Abboud says, “have proven to be some of our really robust, well-adjusted mission specialists” (the company’s term for safety drivers), because they are used to following structured plans and detailed instructions.

Smaller startups have teams with engineering or technical knowledge, who supervise the cars as well as detecting or fixing problems. Ned Boykov is an experienced mechanical engineer, originally from Bulgaria, who has worked for Silicon Valley self-driving startup AiMotive for just over a year as a test and integration engineer.

“I kind of feel like the car is something I'm connected to. I already know the system and know the limitations," he says. "You have to be a very responsible person to be able to do this job. Not only a good driver, because you can be a good driver, but not paying much attention to everything surrounding you for a quick part of the second, and something can go wrong."

Many of May Mobility’s "fleet attendants" are local students with an interest in the technology, and it also likes to hire veterans and retired engineers. “Those are the ideal people that we found, people who are intelligent and motivated, and excited about what we're doing,” says Dykman.

An expired job advert for Waymo, which began life as Google's self-driving car project, stipulates that candidates must be 21 years old, have no more than one point on their license, have no drunk- or drug-driving convictions, and never have been at fault for an accident that caused injury or death.

Joseph Rooney, chief executive of Denver-based firm Elevation Proving Grounds, which sources safety drivers for both car companies and lorry companies, says asking for a clean driving record is common. "They need to be safety-oriented people, because it's a very critical position to autonomous vehicles. If things go wrong it takes a dent on the company [and] on the industry."

Many of the people he recruits to test self-driving lorries are experienced long-distance HGV drivers excited at the chance to go home every night, instead of spending weeks on the road, while applicants to work in cars are often students or recent graduates who are looking for a way into the industry, or desk-bound career-changers tired of the "daily grind".

"Some people, they do have a college education, and maybe their college education isn't relevant to AVs (autonomous vehicles), so they use the driver position as a way to break into the industry," he says.

Jesse Clifton worked in retail before becoming a vehicle operator for Uber. He has now spent two and a half years behind the wheel of self-driving cars, originally at the taxi app firm in Pittsburgh, and now at Silicon Valley startup Voyage, where he works as the company’s dispatch lead, organising the cars and the testing process.

“I don't even think I get that much of an adrenaline rush anymore,” he says. “The first few times, like in Pittsburgh, if something happened your heart would really start racing, you’d pull over for a few minutes and calm down. But after experiencing it so much - you take over if you need to, and you move through those scenarios safely, and then you just continue on.”

His worst fear? Other (human) drivers. “I was honestly more worried about somebody rear-ending me, versus somebody appearing in front of me, or a car running a red light. I was more afraid of the distracted people behind me.”

mirabeau
25/8/2019
13:04
Looking a bit better again here.
hazl
25/8/2019
11:35
thanks mate
mkeene