Have you ever driven a car with ADAS features like Adaptive Cruise Control and/or Lane Keeping Assist? Scary, right? It takes some time to trust the system and relax. Imagine being responsible for the error-free functioning of such a system. In this interview you will learn how this challenge is tackled by Codelab's ADAS testing team.

Q: How does ADAS testing look nowadays?
A: Today we have powerful tools to simulate the road environment, and we can check most of the situations that an ADAS can face on the road. As testers we get access to a large amount of documentation for automotive testing, for example the ISTQB Automotive Software Tester documents, which give us a lot of information about how we should test ADAS systems and which requirements need to be covered by test scenarios.
ADAS testing is full of automated tests run in specialized tools that are able to simulate situations on the road, e.g. CarMaker. We are able to design and implement different situations which could happen in real life.
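As a rough illustration (not CarMaker's actual API; the class and field names below are hypothetical), a simulated road situation can be captured as a small, declarative description that both the simulation setup and the test assertions read from:

```python
from dataclasses import dataclass

@dataclass
class CutInScenario:
    """Hypothetical description of a cut-in situation for one simulation run."""
    ego_speed_kph: float          # initial speed of the ego vehicle
    target_speed_kph: float       # speed of the vehicle cutting in
    cut_in_gap_m: float           # longitudinal gap at the moment of the cut-in
    lane_width_m: float = 3.5
    weather: str = "clear"        # e.g. "clear", "rain", "fog"

# A critical case: a slower vehicle cutting in very close in front of the ego car.
critical_cut_in = CutInScenario(ego_speed_kph=120, target_speed_kph=80, cut_in_gap_m=15)
```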

Q: What was the goal of your ADAS testing activities in the last project?
A: One of the projects we were involved in was the development of a secondary emergency system that takes control of the car when the primary system breaks down. Our team worked on the development of automated tests for Software Qualification Tests (SWE.6 level according to Automotive SPICE). We simulated the behaviour of the car in CarMaker by IPG and checked whether the car behaves according to the software requirements specification. Additionally, we implemented (in Jenkins and Python) the whole testing infrastructure that allowed tests to be executed and their results provided in an automated way. This approach sped up the testing process and helped fix software problems faster.
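As a minimal sketch of what such an automated execution layer can look like (the scenario runner command and result format below are simplified assumptions, not the project's actual code), a Python script can run a batch of simulation scenarios and write a JUnit-style XML report that a Jenkins job then publishes:

```python
import subprocess
import xml.etree.ElementTree as ET
from pathlib import Path

# Hypothetical: each scenario is executed by an external simulation runner
# that returns exit code 0 on pass and non-zero on failure.
SCENARIOS = ["cut_in_close", "follow_mode_highway", "emergency_brake_pedestrian"]

def run_scenario(name: str) -> bool:
    """Run one simulated scenario and report pass/fail (runner name is an assumption)."""
    result = subprocess.run(["run_simulation", "--scenario", name])
    return result.returncode == 0

def write_junit_report(results: dict[str, bool], path: Path) -> None:
    """Write results as JUnit XML so a Jenkins job can publish them."""
    suite = ET.Element("testsuite", name="adas_swe6", tests=str(len(results)))
    for name, passed in results.items():
        case = ET.SubElement(suite, "testcase", name=name)
        if not passed:
            ET.SubElement(case, "failure", message="scenario failed")
    ET.ElementTree(suite).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    results = {name: run_scenario(name) for name in SCENARIOS}
    write_junit_report(results, Path("results.xml"))
```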

Q: How did you get it done? Can you describe the guidelines/process that helped you achieve this goal for ADAS systems?
A: To make all of that run, we worked according to the following process/steps:

  • Software and system (SYS.2 & SWE.1) requirements definition and review
  • Test case definition in Polarion and automation in CarMaker by IPG
  • Realization of the full scope of SWE.5 & SWE.6 according to ASPICE, incl.:
    • Quality reviews
    • Quality gate assessment
    • Process definition (Software Test Plan preparation) and improvements
  • Test results analysis and problem reporting
  • Fault injection toolset creation (a minimal sketch follows this list)
  • ADAS KPIs and Safety Score verification
  • Autonomous driving features analysis and test execution
  • ADAS algorithms and drive simulation in CarMaker by IPG
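For the fault injection item above, a small sketch can illustrate the idea. Assuming the simulation exposes sensor signals as plain values at each timestep (the signal names and the injector interface below are hypothetical, not the actual toolset), a fault injector can corrupt one signal inside a defined time window so the test can check how the system degrades:

```python
from dataclasses import dataclass

@dataclass
class StuckAtFault:
    """Hold a signal at a fixed value inside a time window (hypothetical interface)."""
    signal: str          # e.g. "front_radar.target_distance_m"
    stuck_value: float
    start_s: float
    end_s: float

    def apply(self, t: float, name: str, value: float) -> float:
        """Return the (possibly corrupted) value of `name` at simulation time `t`."""
        if name == self.signal and self.start_s <= t <= self.end_s:
            return self.stuck_value
        return value

# Example: the radar distance reading freezes at 50 m between t=10 s and t=12 s;
# the test then checks that the system detects the fault and degrades safely.
fault = StuckAtFault("front_radar.target_distance_m", 50.0, 10.0, 12.0)
```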

Q: What are the main features which need to be tested?
A: There are a number of features we usually test, such as (but not limited to):

  • Nominal Behavior (keeping the dedicated deceleration and lane)
  • Follow Mode (the ego vehicle follows the vehicle in front and keeps an appropriate distance; a minimal check is sketched after this list)
  • Cut in/out (another vehicle entering or leaving the lane of the ego vehicle)
  • Road Behavior testing: simulating Lane Change Initiation / Completion / Abortion
  • Emergency Brake (simulating emergency braking so that a collision is avoided, or occurs with the lowest possible force)
  • System Degradation
  • Mode Manager
  • Manual Risk Maneuver (cascade testing)
  • TrajectoryValidator (testing the trajectory of the vehicle)
  • Vulnerable Road Users (VRU): motorcycle, cyclist, pedestrian, animal on the road
  • Risk Zone
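For the Follow Mode item above, a typical automated check is that the ego vehicle keeps at least a minimum time gap to the lead vehicle throughout the run. A minimal sketch, assuming the simulation log is available as per-timestep samples (the field names and the 1.5 s limit are assumptions, not project requirements):

```python
def min_time_gap_s(samples: list[dict]) -> float:
    """Smallest time gap to the lead vehicle over the whole run.

    Each sample is assumed to contain the ego speed in m/s and the
    distance to the lead vehicle in metres.
    """
    gaps = [
        s["distance_to_lead_m"] / s["ego_speed_mps"]
        for s in samples
        if s["ego_speed_mps"] > 0.1  # ignore near-standstill samples
    ]
    return min(gaps)

def check_follow_mode_time_gap(samples: list[dict]) -> None:
    # Requirement-style assertion: never drop below a 1.5 s time gap.
    assert min_time_gap_s(samples) >= 1.5
```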

Q: Which levels of ASIL were tested?
A: The main ASIL level we tested was ASIL-D, covering the active safety system that cooperates with the steering and braking systems of the vehicle and with the automated driving system.

Q: What kind of tools were used, and what advantages do these tools have?
A: As a team we worked with the SCRUM methodology, and we used JIRA and Confluence as the main "scrum" tools. Those two allowed us to fully manage the Product and Sprint Backlogs and additionally store the whole project knowledge (manuals, theory, implementation examples, reports, etc.).
JIRA and Confluence are very good, intuitive tools, and working with them was a pleasure.
An additional tool used for managing the project documentation was Polarion. This is a fairly popular tool that supports working in automotive projects according to the ASPICE rules. In it we stored documents such as the system requirements specification, the software requirements specification, the architecture design, and also the software test specifications we created. The tool helped in establishing bilateral traceability; however, working with it was quite annoying and sometimes very slow. Moving all of this into the Atlassian package would probably help a lot.
All our tests were implemented in CarMaker, a tool that gives us the possibility to simulate the road environment. Thanks to CarMaker we were able to: use different types of cars as traffic and simulate their behavior; use other traffic participants (bikes, motorcycles, people, animals); set road properties such as the number of lanes, the road type (straight, curve) and the lane markings (solid, dashed); and set different types of weather. CarMaker also lets us watch or record our tests, which was very helpful during further problem reporting and analysis.

Q: Did you implement automated tests which verified the ADAS solutions?
A: Yes, actually all of our test cases are automated. That is fundamental to testing nowadays, also in ADAS. In the last project we had automated jobs in Jenkins which verified the test results after each software upgrade.
We used Python scripts for the development of test automation, test report creation, test analysis and video analysis. Automation of the communication (our input/output data) with the Polarion system was implemented via the Polarion API.
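As an illustration of the reporting side (a simplified sketch; the result-file layout is an assumption, not the project's actual format), a short Python script can aggregate per-scenario results into one summary report:

```python
import csv
import json
from pathlib import Path

def build_summary(results_dir: Path, report_path: Path) -> None:
    """Collect per-scenario JSON result files into one CSV summary.

    Each result file is assumed to look like:
    {"scenario": "cut_in_close", "verdict": "PASS", "duration_s": 42.7}
    """
    rows = [json.loads(p.read_text()) for p in sorted(results_dir.glob("*.json"))]
    with report_path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["scenario", "verdict", "duration_s"])
        writer.writeheader()
        writer.writerows(rows)
    failed = sum(1 for r in rows if r["verdict"] != "PASS")
    print(f"{len(rows)} scenarios, {failed} failed")

if __name__ == "__main__":
    build_summary(Path("results"), Path("summary.csv"))
```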

Q: How was ISO 26262 implemented in the test procedures?
A: ISO 26262 was implemented in the test procedures by using the ASIL-D risk classification (a major part of this standard). The standard addresses the functional safety requirements for various electrical and electronic systems. The tested safety systems were responsible for assessing the risk appearing on the road and for appropriately counteracting the undesirable effects of a collision. The goal was the safety of all road users.
As testers we were involved in two phases of the safety lifecycle: product concept and product development. We used the following methods from ISO 26262:

  • Test design techniques – we used equivalence partitioning and boundary value analysis (a small example follows this list)
  • Test execution techniques – we simulate the road environment and check the behavior of our system
  • Test environments – we were using Software in the Loop (SiL)
  • Static test techniques – e.g. we were involved in reviewing requirements
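To illustrate the test design techniques mentioned above (the 1.5 s time-gap warning requirement is a made-up example, not a project requirement): equivalence partitioning splits the input range into classes with the same expected behavior, and boundary value analysis picks test values at and around the edges of those classes:

```python
import pytest

# Hypothetical requirement: the system must warn when the time gap
# to the lead vehicle drops below 1.5 s.
WARN_THRESHOLD_S = 1.5

def should_warn(time_gap_s: float) -> bool:
    """Reference model of the expected warning behaviour."""
    return time_gap_s < WARN_THRESHOLD_S

# Boundary value analysis: values just below, at, and just above the threshold,
# plus one representative value from each equivalence class.
@pytest.mark.parametrize(
    "time_gap_s, expected",
    [
        (0.5, True),    # clearly inside the "too close" class
        (1.49, True),   # just below the boundary
        (1.5, False),   # exactly at the boundary
        (1.51, False),  # just above the boundary
        (3.0, False),   # clearly inside the "safe" class
    ],
)
def test_warning_threshold(time_gap_s, expected):
    assert should_warn(time_gap_s) == expected
```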

Q: What was the biggest challenge which you faced in ADAS testing?
A: To fully understand the requirements and the operation of the system while estimating the scope of work a quarter in advance.

Q: What are the upcoming trends in testing ADAS systems?
A: In our opinion, there will be better and more powerful simulation tools. We see this as predefined road situations, possibly based on statistics and legal regulations, so that the test engineer would focus only on the correct measures. Such road situations could be used as a reference to prove that the behaviour of the car is correct.