Have you ever driven a car with ADAS features like Adaptive Cruise Control or Lane Keeping Assist? Scary, right? It takes some time to trust the system and relax. Imagine being responsible for the error-free functioning of such a system. In this interview you will learn how this challenge is being tackled by Codelab's ADAS testing team.

Q: What does ADAS testing look like nowadays?
A: Today we have powerful tools to simulate the road environment, and we can check most of the situations that an ADAS can face on the road. As testers we get access to a large amount of automotive testing documentation, for example the ISTQB Automotive Software Tester materials, which tell us a lot about how we should test ADAS systems and which requirements need to be covered by test scenarios.
ADAS testing is full of automated tests run in specialized tools that can simulate situations on the road, e.g. CarMaker. We are able to design and implement the different situations that could happen in real life.

Q: What was the goal of your ADAS testing activities in the last project?
A: One of the projects we were involved in was the development of a secondary emergency system that takes control of the car when the primary system breaks down. Our team worked on developing automated tests for Software Qualification Tests (the SWE.6 level according to Automotive SPICE). We simulated the behaviour of the car in CarMaker by IPG and checked whether the car behaves according to the SW requirements specification. Additionally, we implemented (in Jenkins and Python) the whole testing infrastructure that allowed us to execute tests and deliver their results in an automated way. This approach sped up the testing process and helped us fix SW problems faster.
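The result-checking part of such a pipeline can be sketched in Python. This is only an illustration under assumed conventions: it presumes each simulation run drops a small JSON verdict file into a results directory; the function name `summarize_results` and the file layout are hypothetical, not the project's actual code.

```python
import json
from pathlib import Path

def summarize_results(results_dir: str) -> dict:
    """Aggregate per-scenario verdicts written by the simulation runs.

    Assumes (hypothetically) that each test run writes a JSON file like
    {"scenario": "cut_in_left", "verdict": "PASS"} into results_dir.
    """
    summary = {"PASS": 0, "FAIL": 0, "failed_scenarios": []}
    for result_file in sorted(Path(results_dir).glob("*.json")):
        result = json.loads(result_file.read_text())
        verdict = result.get("verdict", "FAIL")
        if verdict == "PASS":
            summary["PASS"] += 1
        else:
            summary["FAIL"] += 1
            summary["failed_scenarios"].append(result.get("scenario", result_file.stem))
    return summary
```

A Jenkins job can then call such a script after each software upgrade and fail the build when `summary["FAIL"]` is non-zero.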

Q: How did you get it done? Can you describe the guidelines/process that helped you achieve this goal for ADAS systems?
A: To make all of that run, we worked according to the following process:

  • Software and System (SYS.2 & SWE.1) requirements definition and review
  • Test case definition in Polarion and automation in CarMaker by IPG
  • Realization of the full scope of SWE.5 & SWE.6 according to ASPICE, incl.:
    • Quality reviews
    • Quality gate assessment
    • Process definition (Software Test Plan preparation) and improvements
  • Test results analysis and problem reporting
  • Fault injection toolset creation
  • ADAS KPIs and Safety Score verification
  • Autonomous driving features analysis and test execution
  • ADAS algorithm and drive simulation in CarMaker by IPG

Q: What are the main features which need to be tested?
A: There are a bunch of features we usually test, such as (but not limited to):

  • Nominal Behavior (keeping the dedicated deceleration and lane)
  • Follow Mode (following the vehicle in front and keeping the appropriate distance)
  • Cut in/out (another vehicle entering or leaving the lane of the Ego vehicle)
  • Road Behavior testing: simulating Lane Change Initiation / Completion / Abortion
  • Emergency Brake (simulating emergency braking in order to avoid a collision, or to collide with the lowest possible force)
  • System Degradation
  • Mode Manager
  • Manual Risk Maneuver (cascade testing)
  • TrajectoryValidator (testing the trajectory of the vehicle)
  • Vulnerable Road Users (VRU): motorcycle, cyclist, pedestrian, animal on the road
  • Risk Zone
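To illustrate the kind of expectation behind an Emergency Brake scenario, here is a toy kinematic calculation, not taken from any project requirement: the constant deceleration needed to stop within a given gap follows from v² = 2·a·d.

```python
def required_deceleration(speed_mps: float, gap_m: float) -> float:
    """Constant deceleration (m/s^2) needed to stop within gap_m metres,
    from the kinematic relation v^2 = 2 * a * d."""
    if gap_m <= 0:
        raise ValueError("gap must be positive")
    return speed_mps ** 2 / (2 * gap_m)

# Illustrative numbers: 20 m/s (72 km/h) with 40 m to the obstacle
# needs 400 / 80 = 5.0 m/s^2 of braking.
```

A test oracle for an emergency-brake scenario can then compare the deceleration the system actually commands against such a physical lower bound.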

Q: Which levels of ASIL were tested?
A: The main ASIL level we tested was ASIL D, covering the active safety system that cooperates with the steering and braking systems of the vehicle and with the automated driving system.

Q: What kind of tools were used, and what advantages do these tools have?
A: As a team we worked in the SCRUM methodology, and we used JIRA and Confluence as the main "scrum" tools. Those two allowed us to fully manage the Product and Sprint Backlogs and additionally store the whole project knowledge (manuals, theory, implementation examples, reports, etc.).
JIRA and Confluence are very good, intuitive tools, and working with them was a pleasure.
An additional tool used for managing project documentation was Polarion. It is a fairly popular tool that supports work in automotive projects according to the ASPICE rules. In Polarion we stored documents such as the system requirements specification, the SW requirements specification, the architecture design, and the software test specifications we created ourselves. The tool helped in establishing bilateral traceability; however, working with it was quite annoying and sometimes very slow. Moving all of those artifacts into the Atlassian package would probably help a lot.
All our tests were implemented in CarMaker, a tool that lets us simulate the road environment. Thanks to CarMaker we were able to use different types of cars as traffic and simulate their behavior, use other traffic participants (bikes, motorcycles, people, animals), set road properties such as the number of lanes, the road type (straight, curve) and the lane markings (solid, dotted), and set different types of weather. CarMaker also lets us watch or record our tests, which was very helpful during later problem reporting and analysis.

Q: Did you implement automated tests which verified the ADAS solutions?
A: Yes, actually we have all test cases automated. It is fundamental to testing nowadays, also in ADAS. In the last project we had automated jobs in Jenkins which verified the test results after each software upgrade.
We used Python scripts for the development of test automation, test report creation, test analysis and video analysis. Communication with the Polarion system (for our input/output data) was automated via the Polarion API.
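As a rough illustration of the report-creation step, a minimal Python sketch might turn a mapping of test names to verdicts into a plain-text report; the function and format here are hypothetical, not the project's actual reporting code.

```python
def build_report(results: dict) -> str:
    """Render a plain-text test report from a {test_name: verdict} mapping.

    The layout is a hypothetical example of what an automated
    report-generation script could produce after a test run.
    """
    lines = ["Test report", "-----------"]
    passed = sum(1 for v in results.values() if v == "PASS")
    for name, verdict in sorted(results.items()):
        lines.append(f"{name:<30} {verdict}")
    lines.append(f"{passed}/{len(results)} tests passed")
    return "\n".join(lines)
```

A script like this can write its output to the Jenkins workspace, and the same data can be pushed to Polarion through its API.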

Q: How was ISO 26262 implemented in the test procedures?
A: ISO 26262 was implemented in the test procedures through the ASIL D risk classification (a major part of this standard). The standard deals with the functional safety requirements for various electrical and electronic systems. The tested safety systems were responsible for assessing the risk that appeared on the road and for appropriately counteracting the undesirable effects of a collision. The goal was the safety of all road users.
As testers we were involved in two phases of the safety lifecycle: product concept and product development. We used the following methods from ISO 26262:

  • Test design techniques – we used equivalence partitioning and boundary value analysis
  • Test execution techniques – we simulated the road environment and checked the behavior of our system
  • Test environments – we used Software in the Loop (SiL)
  • Static test techniques – e.g. we were involved in reviewing requirements
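A minimal sketch of how boundary value analysis from the list above can be mechanized in Python; the ACC time-gap range used below is a hypothetical specification value, purely for illustration.

```python
def boundary_values(low: float, high: float, step: float = 0.1) -> list:
    """Classic boundary value analysis: test inputs just below, at, and
    just above each boundary of a valid range [low, high]."""
    return [low - step, low, low + step, high - step, high, high + step]

# Hypothetical spec: ACC time-gap setting is valid between 1.0 s and 2.5 s,
# so the interesting test inputs cluster around those two boundaries.
acc_time_gap_inputs = boundary_values(1.0, 2.5)
```

Inputs inside the range exercise the valid equivalence partition; the out-of-range values check that the system rejects or clamps invalid settings.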

Q: What was the biggest challenge which you faced in ADAS testing?
A: To fully understand the requirements and the operation of the system while estimating the scope of work a quarter in advance.

Q: What are the upcoming trends in testing ADAS systems?
A: In our opinion, there will be better and more powerful simulation tools. We imagine predefined road situations, even based on statistics and legal regulations, so that the test engineer could focus only on the correct measures. Such road situations could be used as a reference to prove that the behaviour of the car is correct.