Have you ever driven a car with ADAS features like Adaptive Cruise Control and/or Lane Keeping Assistant? Scary, right? It takes some time to trust the system and relax. Imagine being responsible for the error-free functioning of such a system. In this interview you will learn how this challenge is being tackled by Codelab's ADAS testing team.

Q: What does ADAS testing look like nowadays?
A: Today we have powerful tools to simulate the road environment, and we can check most of the situations that an ADAS can face on the road. As testers we get access to a large amount of automotive testing documentation, for example the ISTQB Automotive Software Tester syllabus, which gives us a lot of information on how we should test ADAS systems and which requirements need to be covered by test scenarios.
ADAS testing is full of automated tests run in specialized tools which are able to simulate situations on the road, e.g. CarMaker. We are able to design and implement the different situations which could happen in real life.
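For illustration, such a road situation can first be described as a small set of parameters before it is built in the simulation tool. The sketch below is a minimal Python example under that assumption; the scenario fields and value ranges are our own illustrative choices, not CarMaker's native scenario format.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class CutInScenario:
    """Hypothetical parameterization of a cut-in situation (illustrative only)."""
    ego_speed_kmh: float      # speed of the ego (tested) vehicle
    traffic_speed_kmh: float  # speed of the vehicle cutting in
    gap_m: float              # longitudinal gap at the moment of the cut-in

# Sweep representative combinations to generate test variants.
scenarios = [
    CutInScenario(ego, traffic, gap)
    for ego, traffic, gap in product([80.0, 120.0], [60.0, 100.0], [10.0, 25.0, 50.0])
]

for s in scenarios:
    print(s)
```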

Q: What was the goal of your ADAS testing activities in the last project?
A: One of the projects we were involved in was the development of a secondary emergency system that takes control of the car when the primary system breaks down. Our team was working on the development of automated tests for Software Qualification Tests (the SWE.6 level according to Automotive SPICE). We simulated the behaviour of the car in CarMaker by IPG and checked whether the car behaves according to the SW requirements specification. Additionally, we implemented (in Jenkins and Python) the whole testing infrastructure that allowed us to execute tests and provide their results in an automated way. Such an approach sped up the testing process and helped fix SW problems faster.
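A minimal sketch of what one step of such an infrastructure can look like, assuming a command-line entry point per test scenario. The `carmaker_cli` command, scenario names and result format below are hypothetical placeholders, not the project's actual interface:

```python
import json
import subprocess
from pathlib import Path

RESULTS_DIR = Path("results")

def run_scenario(scenario: str) -> dict:
    """Execute one simulation scenario and return a verdict record.

    `carmaker_cli` is a hypothetical wrapper around the simulator;
    in practice the call depends on the local CarMaker installation.
    """
    proc = subprocess.run(
        ["carmaker_cli", "--testrun", scenario],
        capture_output=True, text=True,
    )
    return {"scenario": scenario, "passed": proc.returncode == 0,
            "log": proc.stdout[-500:]}  # keep the tail of the log for reports

def main() -> None:
    RESULTS_DIR.mkdir(exist_ok=True)
    results = [run_scenario(s) for s in ["cut_in_basic", "follow_mode_highway"]]
    (RESULTS_DIR / "report.json").write_text(json.dumps(results, indent=2))
    if not all(r["passed"] for r in results):
        raise SystemExit(1)  # non-zero exit marks the Jenkins build as failed

if __name__ == "__main__":
    main()
```

A Jenkins job then only needs to invoke such a script after each software upgrade and archive the resulting report.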

Q: How did you get it done? Can you describe the guidelines/process that helped you achieve this goal for ADAS systems?
A: To make all of that run, we worked according to the following process/steps:

  • Software and System (SYS.2 & SWE.1) requirements definition and review
  • Test case definition in Polarion and automation in CarMaker by IPG
  • Realization of the full scope of SWE.5 & SWE.6 according to ASPICE, incl.: 
    • Quality reviews
    • Quality gate assessment
    • Process definition (Software Test Plan preparation) and improvements
  • Test results analysis and problem reporting
  • Fault Injection toolset creation (see the sketch after this list)
  • ADAS KPIs and Safety Score verification
  • Autonomous driving features analysis and test execution
  • ADAS algorithms and drive simulation in CarMaker by IPG
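To illustrate the fault-injection step: the idea is to corrupt an input on purpose and check that the system reacts safely. The sketch below is a minimal, self-contained assumption of how such a check can look; the dropout fault, signal values and degradation rule are illustrative, not the project's actual toolset.

```python
import random

def inject_sensor_dropout(signal, start, length):
    """Simulate a sensor fault by blanking a window of samples (None = lost sample)."""
    faulty = list(signal)
    for i in range(start, min(start + length, len(faulty))):
        faulty[i] = None
    return faulty

def system_reaction(signal):
    """Toy stand-in for the system under test: it must degrade on lost data."""
    return "DEGRADED" if any(s is None for s in signal) else "NOMINAL"

# Fault-injection check: the system must report degradation when data is lost.
clean = [random.uniform(20.0, 22.0) for _ in range(100)]   # e.g. distance readings in metres
faulty = inject_sensor_dropout(clean, start=40, length=10)

assert system_reaction(clean) == "NOMINAL"
assert system_reaction(faulty) == "DEGRADED"
print("fault injection check passed")
```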

Q: What are the main features which need to be tested?
A: There are a bunch of features we usually test, including but not limited to:

  • Nominal Behavior (keeping the dedicated deceleration and lane)
  • Follow Mode (following the vehicle in front while keeping the appropriate distance; see the sketch after this list)
  • Cut in/out (a vehicle entering or leaving the lane of the Ego vehicle)
  • Road Behavior testing: simulating Lane Change Initiation / Completion / Abortion
  • Emergency Brake (simulating emergency braking in an emergency in order to avoid a collision, or to collide with the lowest possible force)
  • System Degradation
  • Mode Manager
  • Manual Risk Maneuver (cascade testing)
  • TrajectoryValidator (testing the trajectory of the vehicle)
  • Vulnerable Road Users (VRU) – motorcycle, cyclist, pedestrian, animal on the road
  • Risk Zone
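As an example of how such a feature can be checked against recorded simulation data, here is a minimal sketch of a Follow Mode verification in Python. The 1.8 s minimum time gap and the trace format are assumptions for illustration; the real acceptance criteria come from the SW requirements specification.

```python
MIN_TIME_GAP_S = 1.8  # assumed requirement: minimum time gap to the lead vehicle

def time_gaps(ego_speeds_mps, distances_m):
    """Convert distance-to-lead samples into time gaps (distance / ego speed)."""
    return [d / v for d, v in zip(distances_m, ego_speeds_mps) if v > 0.0]

def follow_mode_ok(ego_speeds_mps, distances_m):
    """Pass if the ego vehicle never undercuts the minimum time gap."""
    return all(gap >= MIN_TIME_GAP_S for gap in time_gaps(ego_speeds_mps, distances_m))

# Example traces, e.g. exported from a simulation run.
speeds = [27.8] * 5            # ~100 km/h, constant
dists = [55.0, 54.0, 52.0, 51.0, 50.5]

print("PASS" if follow_mode_ok(speeds, dists) else "FAIL")
```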

Q: Which levels of ASIL were tested?
A: The main ASIL level we tested was ASIL‑D, covering the active safety system that cooperates with the steering and braking systems of the vehicle and with the automated driving system.

Q: What kind of tools have been used, and what advantages do these tools have?
A: As a team we worked in the SCRUM methodology, and we used JIRA and Confluence as the main “scrum” tools. Those two allowed us to fully manage the Product and Sprint Backlog and, additionally, to store the whole project knowledge (manuals, theory, implementation examples, reports, etc.).
JIRA and Confluence are very good, intuitive tools, and working with them was a pleasure.
An additional tool used for managing the project documentation was Polarion. It is a pretty popular tool that supports work in automotive projects according to the ASPICE rules. In it we stored documents such as the system requirements specification, the SW requirements specification, the architecture design, and also the software test specifications we created. The tool helped in establishing bilateral traceability; however, working with it was very annoying and sometimes very slow. Probably pushing all of this into the Atlassian package would help a lot.
All our tests were implemented in CarMaker – that tool gives us the possibility to simulate the road environment. Thanks to CarMaker we were able to: use different types of cars as traffic and simulate their behavior; use other traffic participants (bikes, motorcycles, people, animals); set road properties such as the number of lanes, the road type (straight, curve) and the lane markings (solid, dotted); and set different types of weather. CarMaker also gives us the possibility to watch or record our tests, which was very helpful during the subsequent reporting of problems and analysis.
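A typical downstream step is to evaluate the quantities recorded during such a simulation against a requirement. The sketch below assumes the run has been exported to a simple CSV file with `time` and `decel` columns; the file layout and the 9.8 m/s² deceleration limit are illustrative assumptions, not CarMaker's native output format.

```python
import csv

MAX_DECEL_MPS2 = 9.8  # assumed requirement: deceleration must stay below this limit

def check_deceleration(csv_path):
    """Return True if the recorded deceleration never exceeds the limit."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["decel"]) > MAX_DECEL_MPS2:
                print(f"violation at t={row['time']} s: {row['decel']} m/s^2")
                return False
    return True

if __name__ == "__main__":
    print("PASS" if check_deceleration("emergency_brake_run.csv") else "FAIL")
```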

Q: Did you implement automated tests which verified the ADAS solutions?
A: Yes, actually we have all test cases automated. It is fundamental to testing nowadays, also in ADAS. In the last project we had automated jobs on Jenkins which verified the test results after a software upgrade.
We used Python scripts for the development of test automation, test report creation, test analysis and video analysis. The automation of our input/output data communication with the Polarion system was implemented via the Polarion API.
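For illustration, pushing a test verdict to a requirements-management system over its HTTP API can look like the sketch below. The endpoint path, payload fields and token handling here are hypothetical placeholders; the actual Polarion API calls depend on the Polarion version and project configuration.

```python
import json
import os
import urllib.request

# Hypothetical endpoint; real Polarion deployments expose their own API paths.
POLARION_URL = os.environ.get("POLARION_URL", "https://polarion.example.com/api/testruns")
TOKEN = os.environ.get("POLARION_TOKEN", "")

def push_result(test_case_id, passed):
    """Send one verdict to the documentation system (illustrative payload)."""
    payload = json.dumps({"testCase": test_case_id,
                          "verdict": "passed" if passed else "failed"}).encode()
    req = urllib.request.Request(
        POLARION_URL, data=payload, method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP error status
        resp.read()

if __name__ == "__main__":
    push_result("TC-1234", passed=True)  # requires a reachable Polarion instance
```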

Q: How was ISO 26262 implemented in the test procedures?
A: ISO 26262 was implemented in the test procedures by using the ASIL‑D risk classification (which is a major part of this standard). The standard deals with the functional safety requirements for various electrical and electronic systems. The tested safety systems were responsible for assessing the risk that appeared on the road and for appropriately counteracting the undesirable effects of a collision. The goal was the safety of all road users.
As testers we were involved in two phases of the safety lifecycle: Product concept and Product development. We used the following methods from ISO 26262:

  • Test design techniques – we used equivalence partitioning and boundary value analysis (see the sketch after this list)
  • Test execution techniques – we simulated the road environment and checked the behavior of our system
  • Test environments – we were using Software in the Loop (SiL)
  • Static test techniques – e.g. we were involved in reviewing requirements
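As a concrete example of these design techniques, equivalence partitioning plus boundary value analysis for a single input range can be generated mechanically. The set-speed range below is an assumed example, not a requirement from the project:

```python
def boundary_values(low, high, step=1.0):
    """Classic boundary value analysis for a valid range [low, high]:
    just below, at, and just above each boundary."""
    return [low - step, low, low + step, high - step, high, high + step]

# Assumed example: Adaptive Cruise Control set-speed valid between 30 and 180 km/h.
# Partitions: below range (invalid), within range (valid), above range (invalid).
for v in boundary_values(30.0, 180.0):
    expected = "accepted" if 30.0 <= v <= 180.0 else "rejected"
    print(f"set speed {v:>6.1f} km/h -> expected: {expected}")
```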

Q: What was the biggest challenge which you faced in ADAS testing?
A: To fully understand the requirements and the operation of the system while estimating the scope of work a quarter in advance.

Q: What are the upcoming trends in testing ADAS systems?
A: In our opinion, there will be better and more powerful simulation tools. We would see this as predefined road situations, possibly even based on statistics and legal regulations, so that the test engineer would focus only on the correct measures. Such road situations could be used as a reference to prove that the behaviour of the car is correct.