
Electric Cars

bricky 13/07/2019 20:02:49
627 forum posts
72 photos

Where is all the power going to come from? Are we going to have forests of wind turbines? And aren't batteries made from harmful materials that will be expensive to recycle or dispose of? I have no intellectual knowledge of these questions and am sure someone will provide answers.

Frank

Vic 13/07/2019 20:23:24
3453 forum posts
23 photos

“Not to underestimate the difficulties, but I think autonomous vehicles are clearly possible today. Whether they're legal on the public highway or not depends only on our attitude to risk. I suggest autonomous vehicles should be accepted as soon as they achieve accident rates lower than humans. As computers don't get tired, drunk, go racing, show-off, misjudge distances, and aren't distracted by leggy blondes it may be sooner than expected!”

Here you go Dave.

**LINK**

pgk pgk 13/07/2019 21:00:04
2661 forum posts
294 photos
Posted by Barnaby Wilde on 13/07/2019 19:06:13:

I would dearly love to rip your whole post apart, maybe over a pint or two. I love folk that are at least capable of thinking about it.

In your scenario, the machine is able to learn from its mistake. In my scenario, & the whole point of my post, is that the machine has just been destroyed because it caught a live grenade.

The exception to the rule is that the machine can be taught to do anything . . . Except for an exception to those rules. If it truly has been programmed to learn from its mistakes then how can it possibly learn from a fatal mistake?

True AI is a load of bollox & it will never happen.

If you're buying... (laughs)

As for being able to learn from fatal mistakes - easy. The AI in my Tesla example sends recent data back whenever the driver overrides its current level of autonomy. It also keeps longer-term logs, and it's rare that any crash destroys every data source - indeed, that's why there are black boxes on aircraft, so that mistakes can be learned from. Again, in the case of Tesla, they have about half a million cars out there now logging billions of miles, and they are having to build ever larger and faster computer systems to sift that data.
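The mechanics of that are simple enough to sketch. Below is a minimal, purely illustrative Python example of event-triggered logging (nothing to do with Tesla's actual software - every name in it is made up): the car keeps a short rolling buffer of sensor frames and freezes a copy whenever the driver overrides the autonomy, so the interesting few seconds survive for later analysis even if the vehicle itself doesn't.

from collections import deque
import time


class OverrideLogger:
    """Keep the last few seconds of sensor frames; snapshot them on driver override."""

    def __init__(self, horizon_frames: int = 100):   # e.g. 10 s of data at 10 Hz
        self.buffer = deque(maxlen=horizon_frames)   # old frames fall off automatically
        self.snapshots = []

    def record_frame(self, frame: dict) -> None:
        # Called continuously while the car is driving.
        self.buffer.append({"t": time.time(), **frame})

    def on_driver_override(self, reason: str) -> None:
        # Freeze a copy of the recent history; a real fleet would upload this
        # for offline analysis, which is how even a destroyed car's last
        # few seconds can still be learned from.
        self.snapshots.append({"reason": reason, "frames": list(self.buffer)})


logger = OverrideLogger()
logger.record_frame({"speed_mps": 27.0, "steering_deg": -1.2})
logger.on_driver_override("manual braking")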

The only difference between human intuition and intelligence and AI is that, at the moment, we can analyse multiple sources of input and cross-reference, ignore, extrapolate and guess much faster than machines can, but I don't see anything there that can't be duplicated in the (near) future, apart from original thought (a rare concept in people).

Mike Poole 13/07/2019 21:39:50
3676 forum posts
82 photos

The best thing about autonomous vehicles is that you will be able to give the leggy blonde your full attention, at least until the wife gives you a poke in the ribs.

Mike

SillyOldDuffer 13/07/2019 22:06:09
10668 forum posts
2415 photos
Posted by Barnaby Wilde on 13/07/2019 18:06:17:
Posted by pgk pgk on 13/07/2019 14:58:14:
Posted by terry callaghan on 13/07/2019 12:19:00:

...

...

Fully autonomous vehicles can only exist if the possibility of "exception handling" is removed from the equation.

Back when I was young & beautiful I gave a very famous lecture to a bunch of egg-headed computer nerds, many of whom went on to create what is now the present state of technology.

I invited a random member of the audience onto stage & we threw a tennis ball to each other. The math, the technology involved in having a robot play 'catch the ball' is phenomenal. Many people misunderstand just how difficult it would be to create such a robot, but it can be done.

I'm trying to explain what is involved as we toss the ball back & forth. Then I substitute a hand grenade for the ball & confirm that the random member is OK with tossing a live grenade.

We tossed that grenade back & forth a few times before my finale . . . . Where I explain that given everything that is involved so far you can make a robot to accomplish this task.

Then I pull the pin from the grenade, let the handle fly away to stage left, count to four & toss it.

Have a good long think about what constitutes an exception to the rule, then have a good long think about sharing the road with a vehicle that cannot possibly ever comprehend one.

First, exception handling is well established in computer programming - many programming languages provide a mechanism whereby unexpected events can be handled. It needn't be complicated: in an autonomous car, a good strategy for dealing with the unexpected would usually be 'STOP'.
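For anyone who hasn't met the idea, here's a minimal sketch in Python (the planner and command names are invented for illustration, not any real car's software). The normal plan runs inside a try block, and anything the planner can't make sense of falls through to a controlled stop.

class PlanningError(Exception):
    """Raised when the driving software cannot produce a safe plan."""


def plan_next_move(sensor_data):
    # Stand-in for the real planner; raises if the situation is not understood.
    if sensor_data.get("obstacle") == "unknown":
        raise PlanningError("unclassified obstacle ahead")
    return {"throttle": 0.2, "brake": 0.0, "steering": 0.0}


def drive_one_step(sensor_data):
    try:
        command = plan_next_move(sensor_data)
    except PlanningError:
        # The 'STOP' strategy: when the software meets something it cannot
        # handle, it degrades to the safest simple action rather than guessing.
        command = {"throttle": 0.0, "brake": 1.0, "steering": 0.0}
    return command


print(drive_one_step({"obstacle": "unknown"}))   # -> the controlled-stop command

The point is only that 'unexpected' does not have to mean 'unhandled'.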

Secondly, if autonomous cars are forbidden by your logic then so are CNC machine tools because they cannot deal with exceptions like the operator falling under the cutter. Actually, with reasonable precautions, most of us are happy to take the risk.

Thirdly, although humans are good at learning from experience, it can't be assumed people will deal with an exception better than a machine. Most fast human reactions are reflexes rather than considered actions; you cannot learn to drive by studying the Highway Code, you have to train your primitive brain to perform stops, gear changes, steering and so on. Our actions are at least semi-automatic. An extended childhood is necessary for this and we never stop learning. What doesn't kill us makes us stronger. However, the human process of learning by repetition can be emulated by a robot, or pre-programmed, and it may be good enough. Aircraft are capable of flying to a destination and landing without the pilot touching the controls.

People rarely have to decide ethical questions when driving, but if an autonomous vehicle had to deal with a multi-choice exception it could apply the principle of least harm. It might get the answer wrong, but so do people - all the time. A poor response is likely whenever anyone is taken by surprise by a novel situation, such as someone throwing a grenade at them. Pretty dangerous whatever happens: maybe it's safer not to catch it, maybe the correct response is to save others by altruistically falling on it. Attacking the thrower is also a valid response to your demonstration - don't try it at Heathrow unless you want to attract gunfire.
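As a toy illustration of 'least harm' (the actions, outcome labels and harm scores below are all invented), the selection mechanism itself is trivial; the genuinely hard part is assigning the scores.

# Hypothetical expected-harm scores for one emergency scenario (lower is better).
candidate_actions = {
    "emergency_stop":     {"occupants": 0.2, "pedestrians": 0.0, "other_traffic": 0.3},
    "swerve_left":        {"occupants": 0.4, "pedestrians": 0.1, "other_traffic": 0.5},
    "continue_and_brake": {"occupants": 0.1, "pedestrians": 0.7, "other_traffic": 0.2},
}


def least_harm(actions: dict) -> str:
    # Sum the expected harm to everyone affected and pick the smallest total.
    return min(actions, key=lambda a: sum(actions[a].values()))


print(least_harm(candidate_actions))   # -> 'emergency_stop' for these made-up numbers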

There's a reason you weren't duffed up: you didn't throw a live grenade. So there's no rule and no exception to the rule - it was a fake. The demonstration is faulty for another reason: even had it been a real grenade, both machine and human are faced with a range of possibilities with no correct solution. There certainly isn't an answer that humans always get right and machines always get wrong. Airport guards are trained to shoot terrorists, and a machine could be programmed to do the same. Both are capable of blundering into tragedy.

Nice bit of theatre, but your trick proved nothing. I wonder how many of the egg head nerds were fooled?

Dave

PS counting to four with a real grenade would have given you a tricky exception to deal with. Most modern grenades explode two or three seconds after the lever is released. It's done to stop the other guy throwing it back. Grenades are designed to maim whatever the recipient does.


martin perman 13/07/2019 22:43:22
2095 forum posts
75 photos

Based on this, nobody has an original thought, as we are programmed, taught or whatever by our parents and betters. Everything we do or say has originated from somewhere else in the past.

Is it possible to have an original thought?

The answer is yes and no. Yes, there are original ways to express thoughts, ideas, concepts and philosophies, but no, the actual subjects upon which these thoughts, ideas, concepts and philosophies are based are not original. ... There's no such thing as originality, just authenticity.

Martin P

Michael Gilligan 13/07/2019 23:30:27
23121 forum posts
1360 photos

For no particularly good reason, I will share a real event which happened this morning:

Picture a set of traffic lights, with two clearly defined lanes on both approach and exit ...

The exit lanes merge into one after a reasonably short distance [the implied rule being that the two lines of vehicles will take their places alternately].

As I approached the lights, there were two stationary vehicles [a car, followed by a van] in the nearside lane, and the outside lane was clear. ... so, I occupied the vacant position in the outside lane.

The lights changed, and I held-back for a moment to let the car go first.

I then realised that the van was tailgating the car and effectively [intentionally?] blocking my merge.

My [necessarily split-second] decision was to pass both vehicles, to avoid the oncoming traffic.

The manoeuvre was completed without undue fuss, but clearly irritated the car driver!

... I would have liked to apologise and explain; but we rarely have that opportunity.

So: What would the autonomous vehicle have done, in my situation ?

.

MichaelG.

dcosta 14/07/2019 00:53:00
496 forum posts
207 photos

Hi,
Autonomous vehicle?
See **HERE** please.

Link: https://www.translatetheweb.com/?from=pt&to=en&refd=www.translatetheweb.com&dl=en&rr=UC&a=https%3a%2f%2fwww.portugalms.com%2fcascais-ja-tem-um-veiculo-autonomo-mas-quer-ter-doze%2f

Regards
Dias Costa

Bill Phinn 14/07/2019 04:43:22
1076 forum posts
129 photos
Posted by Barnaby Wilde on 14/07/2019 02:57:40:

Whilst we are straying from the OP slightly I do think that it's still all interconnected.

Complex problems require complex solutions. Will the AI that is capable of driving a car along a busy highway also be able to solve a marriage breakdown?

Sounds silly, doesn't it, but if you take a complex problem like driving along a busy highway & compare that to a complex problem like a marriage breakdown, then anything that can miraculously solve one should easily be able to tackle the other.

When AI can give coherent and unplagiarised A grade answers to essay questions on literary topics such as those I faced in A Level French:

"Discuss Flaubert's use of symbolism and irony in Madame Bovary".

"In Flaubert's Madame Bovary, how did Charles Bovary's personality persuade Emma to marry him and contribute to her downfall?"

"In what ways is Madame Bovary a realistic novel?"

then I will be prepared to change my present view: that there are significant limits to the ways in which AI can, and foreseeably will be able to, substitute for or even merely help human beings.

Perhaps, though, you were being ironic.

pgk pgk 14/07/2019 06:01:32
2661 forum posts
294 photos

I think the last few posters are missing the point of computer AI.

1) In the case of the issues at the traffic lights, the simplified options were to race ahead or to wait and take a gap behind. Racing ahead worked this time (and my EV could have beaten the lot off the lights if I'd wanted to) but was based on male aggression and risked causing a road rage incident. Doubtless AI would have learned that it's statistically better to arrive later but undamaged more often than to take those risks. Indeed, AI might have learned that such extra lanes are better not used, and just queued to begin with... patiently.
As far as the piloting of aircraft goes, with Boeing's latest problem the pilots on board didn't help, due to the engineering error in the software.

2) AI developed for driving isn't designed for marriage counselling, any more than your average HGV driver is trained for that role. It's likely that such counselling has its own inherent rules, sympathies and persuasions that can be turned into algorithms, and if those were inserted into an autonomous HGV's database it could do both roles. AI improves with iterations (as do people, with experience).

3) Madame Bovary. You went through several lessons at school being taught the sort of approach to analysing a text and the sort of things examiners would be looking for to give marks. It had rules. Just consider the vast improvements that have occurred in voice recognition and translation software over the last decade. There has long been software that can change a piece of prose into a particular author's style, since authors have an (evolving) depth of vocabulary, an individual understanding of grammar and punctuation, a degree of verbosity, pretension, etc. AI would at least have a total memory of that book, rather than limited human recall, and a complete dictionary.

AI is still evolving and of course has limitations - so do people. AI is a tool and should be accepted as such. You don't expect an HGV to race at Le Mans, or an F1 car to carry a shipping container or tow a plough - those engineering marvels were developed for a specific task, took a long time to get there, and are still being tweaked. Indeed, much of the advance is down to computer simulation - trial and error.

Michael Gilligan 14/07/2019 06:51:04
23121 forum posts
1360 photos
Posted by pgk pgk on 14/07/2019 06:01:32:

I think the last few posters are missing the point of computer AI.

1) In the case of the issues at the traffic lights, the simplified options were to race ahead or to wait and take a gap behind. ...
.

Apologies if I am 'missing the point' ... but I described briefly [and only from my perspective] a real incident, that occurred at a real, recently built, road junction.

  • If I had braked firmly to let the van have his way, someone may have piled into the back of me [there was no-one behind me when we left the lights, but ... who knows?]
  • Yes, I could [and probably should] have simply taken third position in an orderly queue; but that would have meant not using the road as it was clearly intended
  • I was in no hurry, and had nothing to 'prove' [I was driving my Wife to a meeting]
  • As for 'male aggression' ... I don't know: There was no aggression on my part, it was just a 'survival' choice, and I didn't have the time or inclination to check the gender of the other two drivers

In the future, roads may be designed so as to preclude such incidents; and the vehicles may all have appropriate algorithms to preserve themselves and their passengers ... but during the 'implementation period' things will remain complex on our roads.

I was trying to give an example for analysis and discussion; but evidently my first few words were the appropriate ones.

MichaelG.


Edited By Michael Gilligan on 14/07/2019 06:51:44

Michael Horner 14/07/2019 07:12:10
229 forum posts
63 photos
Posted by Michael Gilligan on 13/07/2019 23:30:27:

The exit lanes merge into one after a reasonably short distance [the implied rule being that the two lines of vehicles will take their places alternately].

MichaelG.

Hi Michael

You answered your own question.

This would be my take on it, unless they are going to program in male aggression and selfishness.

The cars would agree amongst themselves who would be first away from the lights.

Cheers Michael

pgk pgk 14/07/2019 07:31:38
2661 forum posts
294 photos
Posted by Michael Gilligan on 14/07/2019 06:51:04:

Apologies if I am 'missing the point' ... but I described briefly [and only from my perspective] a real incident, that occurred at a real, recently built, road junction.

...

I've often wondered why they put that extra lane in. It serves two purposes, I suppose: it shortens the length of the queue at the lights, and presumably it was originally intended for occasional heavy, slow stuff to use the nearside and allow nippy stuff to pass as the lights change. We've all experienced that two-lane scenario coming up to a roundabout with an HGV on the inside, and having to really keep a look-out that he has enough room for the curve of the roundabout without squeezing us...

If that lane is really intended as a neat and polite alternate car feed-in, then it has little benefit over an orderly queue, and you'd think it would be documented in the Highway Code (I haven't looked).

1) You say that if you'd braked, someone might have hit you from behind. Yes, I accept that in a real emergency stop there may not be time to check mirrors - at least an autonomous vehicle keeps that information updated at all times - but one also hopes/expects that any driver behind you would be traffic-aware and paying attention to your situation. If the vehicle behind were autonomous then it would respond to your braking faster than any human reaction.

2) We have some variance on the possible reasons for that lane. I'm not so clear on why it's there.

3) I wasn't trying to suggest you had anything to 'prove', as you put it, but evidently your interpretation of the queue rules was different from the van driver's, and you felt you had right of way.

4) Apologies if I implied that you were aggressive, but you may concede that the whole scenario occurred primarily because the van driver was being pushy while you interpreted the situation as your right of way. Such episodes are the usual trigger for road rage, even when it's a simple error by one driver. Hence we should all drive defensively. If I'd been in that situation then, honestly, it would have been a case of what mood I was in - again, something that wouldn't affect AI. If I was in my slow car with poor brakes then I'd have been driving carefully and cautiously.

If I'd been in my EV with its stunning acceleration and hot brakes, and genuinely in a hurry or just feeling playful, then I admit I might have floored it (my bad), but mostly I drive that one with care 'cos it's so sodding expensive, repairs won't be cheap, and age has slowed my reflexes - my excuse for buying a car that protects its occupants that well.

not done it yet 14/07/2019 07:33:00
7517 forum posts
20 photos

Questions:

How do auto-piloted vehicles react to a good covering of snow? Do they simply stop?

How about black ice? Can they see it? Do they attempt the impossible (like some drivers in YouTube videos, where the vehicle simply accelerates backwards after losing adhesion while attempting to drive up a slope)?

Perhaps they will simply take to the air in future?

pgk pgk 14/07/2019 08:06:17
2661 forum posts
294 photos
Posted by not done it yet on 14/07/2019 07:33:00:

Questions:

How do auto-piloted vehicles react to a good covering of snow? Do they simply stop? ...

At the current level of autonomy of my car, it will stay in self-steer if it can determine where the road is, and it does that by assessing edges and markings and making some limited assumptions. I have used self-drive in lousy rainy night conditions and monitored its display when its cameras were clearly seeing better than I could; but with completely covered road markings and snow drifts, or confusing (to it) road markings, or no road markings at all, then at the present state of AI it won't self-drive.

How do you determine black ice? It's a combination of temperature, colour, reflectance and doubtless other factors. AI will learn (really that means be taught, by presenting it with multiple sets of data to sort out) what factors affect traction, along with cargo load and slope angle/camber. Your car at the moment has traction control built in to assist.
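To give a flavour of what 'taught by presenting it with multiple sets of data' ends up looking like, here's a deliberately crude sketch with hand-picked cues and weights - a real system would learn the weights from labelled data, and every number here is invented for illustration.

def black_ice_risk(road_temp_c: float, air_temp_c: float,
                   reflectance: float, recent_rain: bool) -> float:
    # Return a crude 0..1 risk score from a few hand-picked cues (illustrative only).
    score = 0.0
    if road_temp_c <= 0.5:
        score += 0.4          # surface at or near freezing
    if air_temp_c <= 2.0:
        score += 0.2
    if reflectance > 0.6:
        score += 0.2          # unusually shiny patch in the camera image
    if recent_rain:
        score += 0.2          # water available to freeze
    return min(score, 1.0)


risk = black_ice_risk(road_temp_c=-1.0, air_temp_c=1.0, reflectance=0.7, recent_rain=True)
if risk > 0.5:
    print(f"risk {risk:.1f}: reduce speed and widen the following distance")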

Elon Musk claims that the new Roadster will be available with a SpaceX package option to give rocket thrust for acceleration and directional control, and possibly short flight. He does make these wild promises, and often they come to pass - somewhat later than everyone hoped. The limit of road car acceleration is down to tyre grip; rockets bypass that.

pgk pgk 14/07/2019 08:34:46
2661 forum posts
294 photos
Posted by Barnaby Wilde on 14/07/2019 08:14:18:...

What the computer running a programme can NEVER do is to question itself whether it wants to play ball.

Why not? What are the determinants? Social pressures, time of the month, workload, sickness, muscle strain? I accept that people and animals are hugely complex and the task of creating an imitation of our behaviours is huge - should one ever actually want to do it (rather pointless when people are cheap to make and easily thrown away) - but I dare say a simulation could be developed...

J Hancock 14/07/2019 09:51:43
869 forum posts

Call me 'old-fashioned' but I don't really want to participate in this futuristic world you have described.

Reads utterly boring, like being the Mekon, sitting on that flying saucer thing he used to move around on.

Nope, 1964-ish was nirvana; downhill ever since.

SillyOldDuffer 14/07/2019 10:30:45
10668 forum posts
2415 photos
Posted by Michael Gilligan on 14/07/2019 06:51:04:
Posted by pgk pgk on 14/07/2019 06:01:32:

I think the last few posters are missing the point of computer AI. ...

Apologies if I am 'missing the point' ... but I described briefly [and only from my perspective] a real incident, that occurred at a real, recently built, road junction.
...

It's very useful to explore scenarios like this and score what humans do against the autonomous vehicle. In this particular example, I think the autonomous vehicle has the advantage:

  • It understands the rule of the road (merge) when real drivers may not, and would try to cooperate with other traffic.
  • It has no emotional response; it won't tailgate because it's busting for a pee, or behave aggressively due to what happened 5 minutes ago, or deliberately ram a police car.
  • It knows exactly what is in front, behind and to the sides with millisecond accuracy. It also knows where it is within a few metres, the temperature and the risk of ice, and can detect almost instantly when the wheels slip, whatever the cause. Humans are at least 0.2 seconds behind reality, and much worse if the check involves looking in a mirror, during which time they are blind to what's in front. A sneeze loses about 2 seconds. Most drivers are bad on ice unless they've been trained, and even good ones react slowly. (Some rough stopping-distance figures follow this list.)
  • In court, after the accident, the human driver has to explain himself. He is an unreliable witness compared with the scrupulously accurate record kept by the autonomous vehicle. Quite likely, I think, the autonomous vehicle's data would normally prove the human driver was at fault, much as black box recorders show most air crashes to be caused by pilot error. I don't think pleading innocence because the car's AI system can't resolve marriage disputes or analyse Flaubert will impress judges or insurance companies!
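Putting rough numbers on the reaction-time point (ordinary stopping-distance arithmetic; the speed, reaction times and deceleration below are illustrative assumptions, not measurements):

def stopping_distance(speed_mps: float, reaction_s: float, decel_mps2: float = 7.0) -> float:
    # Thinking distance plus braking distance, assuming constant deceleration.
    thinking = speed_mps * reaction_s
    braking = speed_mps ** 2 / (2 * decel_mps2)
    return thinking + braking


speed = 13.4                                   # roughly 30 mph, in metres per second
for label, reaction in [("automated (~0.2 s)", 0.2), ("alert human (~1.5 s)", 1.5)]:
    print(f"{label}: {stopping_distance(speed, reaction):.1f} m")
# automated: ~15.5 m, alert human: ~32.9 m with these assumed figures

At about 30 mph, the 1.3 second difference in reaction time alone is worth roughly 17 metres of extra travel before the brakes even begin to bite.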

[Image: fatalplane.jpg - chart of the causes of fatal air accidents, with pilot error the largest category]

Note that, apart from outright pilot error being number one, pilot error appears again in 'Pilot Error (Weather Related)' and 'Pilot Error (Mechanical Related)'.

Here's one where a human might do better. You're driving at speed up an empty road where temporary traffic lights are protecting the movements of a digger just out-of-sight round the corner. The lights have failed. A human might spot the hazard by recognising the in-position lights are not Red, Green or Amber. He should slow down. As the car's map is wrong it has to rely on its sensors and might be travelling too fast to stop in time. But, as I noted earlier, once a hazard is detected, the car will react much faster than a human. We can rely on them to do super-human emergency stops. Maybe even in this situation human and car might score about the same.

On the other hand, autonomous cars wouldn't deliberately or accidentally jump lights as most people do at one time or another. In my experience bad drivers are more common than broken traffic lights!

At root I think autonomous cars are an engineering problem with engineering answers. It's not essential for drivers to be human.

Dave

Mike Poole 14/07/2019 15:09:06
3676 forum posts
82 photos

I look forward to trucks being under autonomous control. My trips to Italy last year and this year both encountered trucks that wandered out of their lane; last year I was very close to being the meat in a truck-and-Armco sandwich, and hard braking kept me out of a very quickly diminishing gap. The truck driver apologised with a wave and a flash of the lights, but it seems there are too many tired or bored drivers in charge of big trucks.

Mike

Howard Lewis 14/07/2019 17:02:11
7227 forum posts
21 photos

Maybe the truck drivers have been taught by many of the UK car drivers, who seem to be unaware of what is around them. In the last 12 months I have three times nearly been run off the road by the car alongside changing lanes without looking!

Being a control freak, and having been used to commercial vehicles, I don't like the thought of my car driving itself, but autonomous vehicles may be safer than some on the roads.

Howard

This thread is closed.
